Heuristic Analysis and Usability Testing

The MathWorks’ Add-On Explorer interface

Overview

Description: Worked as part of a team to perform heuristic analysis and usability testing for The MathWorks’ Add-On Explorer interface
Client: The MathWorks [via Bentley University]
Date: Fall 2017
Role: UX Design
Skills: Heuristic evaluation, usability testing, information architecture
Tools: Remote usability testing via Zoom

The Challenge

The Add-On Explorer is an interface for finding and downloading official and community-developed plugins that provide additional functionality for MATLAB and Simulink software. The collection consists of almost 34,000 add-ons spanning a wide range of domains and applications. Our focus was on evaluating the usability of the current hierarchical organization scheme and the general interface for new users of the platform.

Heuristic Analysis

The first phase of the project was a heuristic evaluation combining Jakob Nielsen’s 10 usability heuristics for general usability with the content analysis heuristics developed by Fred Liese to address the organization of the information hierarchy. Issues found were classified according to three levels of severity. Additionally, positive findings were included to indicate features that should be preserved in future iterations.
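
As an illustration of how such findings can be recorded consistently across evaluators, each issue can be logged with the heuristic it relates to, a severity level, and a flag for positive findings. This is a hypothetical Python sketch; the field names and severity labels are illustrative, not the rubric we used:

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Severity(Enum):
        # Three severity levels; the labels here are illustrative.
        MINOR = 1
        MODERATE = 2
        MAJOR = 3

    @dataclass
    class Finding:
        heuristic: str                       # e.g. a Nielsen or Liese heuristic
        description: str
        severity: Optional[Severity] = None  # None for positive findings
        positive: bool = False               # features worth preserving

    findings = [
        Finding("Match between system and the real world",
                "Labelling terminology is insufficiently explained",
                Severity.MODERATE),
        Finding("Flexibility and efficiency of use",
                "Complementary browse and search paths work well",
                positive=True),
    ]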

Overview of Findings

Given the very large, diverse, and complex database of items that this interface provides access to, the Add-On Explorer is quite effective, providing complementary browse and search paths for locating relevant add-ons. No issues were found that would halt user progress entirely, but users, particularly novices, may be overwhelmed, frustrated, or made less efficient by the following:

  • Insufficient differentiation between and explanation of labelling terminology
  • Unclear scope and low “information scent” due to the complexity of the organizing categories and how they are displayed
  • Gaps in search coverage
  • Low visibility of certain options and product details
  • Inconsistency in the structure of the categories list

[Figure: Process overview]

Usability Testing

Having familiarized ourselves with the interface through our own evaluation, we moved to the second phase of the project: usability testing with current MATLAB users.

Research Questions

  • What are users’ reactions to and impressions of the basic interaction of the category widget?
  • Do the categories and their structure make sense to users?
  • Can customers find a specific add-on that they know they want?
  • Can customers discover new add-ons that they were previously unaware of?

Test Plan

Participants were recruited based on the duration and recency of their MATLAB use, with the goal of representing use across a variety of disciplines. One-on-one sessions were conducted remotely via Zoom video conferencing software and relied on a think-aloud protocol.

The session consisted of three main parts:

  1. Background questions about the participant’s field of work and previous experience with MATLAB, add-ons, and the Add-On Explorer
  2. Six tasks related to searching, browsing, and filtering add-ons
  3. Post-test feedback and administration of a System Usability Scale (SUS) questionnaire (scored as sketched below)
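
For context, SUS responses are conventionally converted to a single 0–100 score by rescaling each of the ten items and summing the results. A minimal Python sketch of that standard scoring formula (the example ratings are hypothetical, not participant data):

    def sus_score(responses):
        """Convert ten SUS item ratings (1-5 each) to the standard 0-100 score."""
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS requires ten ratings, each from 1 to 5")
        total = 0
        for item, rating in enumerate(responses, start=1):
            # Odd-numbered items are positively worded: contribution = rating - 1.
            # Even-numbered items are negatively worded: contribution = 5 - rating.
            total += (rating - 1) if item % 2 == 1 else (5 - rating)
        return total * 2.5  # rescale the 0-40 sum to the 0-100 range

    # Hypothetical ratings for items 1-10 (not actual participant data):
    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0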

Selected Findings

Testing confirmed several of the observations that arose during the heuristic analysis, including confusing nomenclature, problematic filter and category ordering, and formatting inconsistencies. Observation during testing also yielded more general insights about user behavior: participants used the categories to gauge the scope of the collection as well as to browse it, relied heavily on first-page results, and showed low persistence in search, often mistakenly concluding that their target did not exist. The combined findings of the usability testing and heuristic analysis were distilled into a prioritized list of recommendations for future updates, which we were invited to present to MathWorks’ full team of developers, designers, and researchers.