Design

Team talking with designers at Bloomberg during a design critique session.

VISIONING & SCOPING

After considering Bloomberg’s interests, prior developments in the accessibility space, and the feasibility of projects given our time span, we chose to focus on the insight, “It’s impossible for me to get the ‘gist’ of the page.” All 5 people with visual impairments we spoke to told us that they struggle with getting the gist of text, of visual elements like images and graphs, and/or with keeping track of windows on their system. We brainstormed over 70 possible ideas to address these problems before deciding to focus on improving the gist of visual elements, specifically graphs and charts. Graphs are essential to the Bloomberg Terminal, yet they are typically so inaccessible to people with visual impairments that they can impede the pursuit of STEM-related careers, such as those in finance.

Images from Data Driven DJ's connected project and Ed Summers' iBooks chart.

COMPETITIVE ANALYSIS

While we discussed how we might communicate the smell and taste of the stock market, we ultimately focused on tactile, haptic, and auditory methods. After further research, we chose to pursue an auditory solution so that users wouldn't need to purchase or install additional hardware; they could simply use their existing computer or mobile devices.

We conducted a competitive analysis to understand what work has already been done with respect to audio graphs. Our research led us to the Sonification Sandbox at Georgia Tech, which takes in static data and returns customizable audio output of a single line chart. We also discovered the Data Driven DJ, from a developer at the New York Public Library, which explores creative ways to sonify data about New York City. Finally, we reviewed the work of Ed Summers, an industry accessibility expert, whose company creates electronic books with audio graphs of static data on different charts. His technology is the most cutting-edge that we identified in the data sonification space.

Team member writing questions and ideas on white paper during brainstorming session.

PROBLEM SPACE

Line graphs are commonly found in the Terminal, so we focused on understanding why people find line graphs valuable and how they generally interact with them. We spoke with people with visual impairments about how they first learned to use charts and graphs before researching further.

We conducted graph expert interviews with internal engineers and designers who work on building charts to learn about their design approach and client graph usage. We also interviewed people who regularly use charts on the Terminal to understand what information they look for in line graphs and how they quickly gather that information.

We conducted 2-minute think-aloud sessions with Bloomberg employees, asking participants to look briefly at a line graph and explain what information they look for and what actions they perform. We synthesized these results into a keyword heat map highlighting which values matter most for conveying the gist of a graph.

ITERATIVE DESIGN

Phase One

Initial Design Ideas

We presented 11 early ideas as paper and Keynote prototypes to members of the Bloomberg UX team. We also ran our ideas by a Bloomberg employee with a visual impairment to find out which concepts she was most receptive to. From there, we moved forward with ideas relating to helping users with visual impairments get the gist of Graphical User Interfaces (GUIs) or different types of data representations. We created prototypes for these 5 specific concepts:

  1. Audio representation of line graphs through computer generated pitch
  2. Ambient sound of changing stock prices in a table
  3. Keyboard shortcuts to listen to line graphs
  4. Tactile line chart
  5. Hearing elements of the GUI spatially

Testing

We recruited two engineers from Bloomberg to test our initial 5 design concepts. Each of them completed tasks using the prototypes and spoke aloud about their thoughts.


Observations
  • Participants understood tactile charts quickly
  • Participants understood sonified charts, though with less comfort than tactile charts
  • Ambient sound could be incorporated into a data sonification prototype


Results

Although participants had an affinity for tactile charts of static data, we found that designing a new, dynamic tactile interface was not feasible given our time constraints. We therefore chose to focus on data sonification of stock price line graphs.

Phase Two


Image of usability lab setup with laptop, trackpad, and keyboard.

After some of our earlier testing, we realized that other projects in the sonification space were missing an interactive component. Rather than pressing a button to play the audio version of a graph, we sought to create a tool that would allow someone with a visual impairment to piece together a more holistic understanding of a dataset. So, we developed and tested 3 prototypes:

  • A desktop application that allowed users to interact with an audio graph via touch on a computer trackpad. Similar to how you can scrub through audio or video clips in editing software or YouTube videos, our prototype enabled users to drag a finger left and right along a trackpad at any pace to play a sonified line of stock prices. Horizontal movement across the trackpad traversed X-values, while the rising and falling pitch of audio tones corresponded to Y-values (see the sketch after this list). Users could also press the spacebar at any point to have a screen reader read out the price and date at that point.
  • An application that allowed people to use keyboard shortcuts to play the trend of a line and reveal important summary information such as highest, lowest, and current price.
  • A system to test sound preferences in which we used an open source tool built by the Sonification Lab at Georgia Tech to experiment with different sounds for line graphs. We could sonify lines using different instruments (e.g., saxophone) and tweak the volume, speed, and other qualities of the sound.
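As a rough sketch of how such a mapping can work, the Swift snippet below maps a scrub position and a price series to a tone frequency. The two-octave frequency band, the linear price-to-pitch mapping, and the function names are our own illustrative assumptions, not the actual parameters of the Georgia Tech tool or our prototype.

```swift
import Foundation

/// Map a price onto an audible frequency band. The linear mapping and
/// the 220 Hz to 880 Hz band (two octaves) are illustrative assumptions.
func frequency(for price: Double,
               minPrice: Double,
               maxPrice: Double,
               lowHz: Double = 220.0,
               highHz: Double = 880.0) -> Double {
    guard maxPrice > minPrice else { return lowHz }
    let t = (price - minPrice) / (maxPrice - minPrice) // normalize to 0...1
    return lowHz + t * (highHz - lowHz)                // higher price, higher pitch
}

/// Scrubbing: a horizontal position on the trackpad (0...1) selects a
/// point in the series; the returned frequency is what a synthesizer
/// would play at that instant.
func sample(atX x: Double, prices: [Double]) -> (price: Double, hz: Double) {
    precondition(!prices.isEmpty)
    let clamped = min(max(x, 0.0), 1.0)
    let index = Int(clamped * Double(prices.count - 1))
    let price = prices[index]
    return (price, frequency(for: price,
                             minPrice: prices.min()!,
                             maxPrice: prices.max()!))
}
```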

Research Questions:

Track-Pad Scrubbing

  • Are participants able to understand and perform the scrubbing interaction on a track-pad?
  • Are participants able to gather the gist of a line graph by using the track-pad scrubbing interaction?
  • Are participants able to gather the specific values at points and identify points of drastic change?

Keyboard Audio Navigation

  • Are participants able to gather the gist of a line graph by using the keyboard shortcuts?
  • Does the ability to navigate through specific values on the graph help participants better comprehend the graph?
  • Are participants able to retain important information from the graph?
  • Do participants prefer a keyboard navigation system to the trackpad interaction?

Sonification Sound Preference

  • What sound specifications do participants prefer?
  • How accurately are participants able to comprehend the audio graph?

Testing

Recruiting

Because we had limited access to participants with complete visual impairment, we tested with 6 sighted employees as proxy participants. We asked them to interact with the prototypes without any visual interface and answer questions based solely on what they heard.

Pilot Test

We first ran a pilot test with our client, after which we finalized our prototypes and protocol, including the post-test questionnaire and the order of tasks.

Main Test

In each session, we administered a pre-test questionnaire and went through tasks for each of the 3 prototypes in a 30-minute time span in the Bloomberg Usability Laboratory. We asked participants to reproduce the audio they heard as a drawing on paper to gauge how accurately the prototypes conveyed information. Each session had a moderator and four other researchers observing through a one-way mirror and taking notes.

Key Observations
  • Participants who used the scrubbing prototype were more successful at quickly drawing the "gist" of the graph, while participants who used the keyboard prototype took longer but drew more detailed and refined graphs.
  • Participants wanted information beyond the trend line; they wanted to know the exact values that went along with the scrubbing interaction, as well as summary information about axes, date ranges, highs, and lows.
  • Participants preferred a more joyful sound, such as the steel drum option, for listening to the sonified graph.
  • Mixing trackpad and keyboard interactions was not effective; participants using the trackpad often forgot about the associated spacebar interaction or neglected to press it.
  • Participants found the trackpad's relative spatial positioning confusing. If they touched the top right corner of the trackpad, they expected to be on the top right corner of the graph. Instead, their position moved slightly northeast of their previous position, often ending up far away from the top right corner of the graph.

Results

Based on these observations, we decided to combine the positive elements of the two interactive prototypes into a single design: the interactive "scrubbing" control for line audio from the trackpad prototype, plus the key data values from the keyboard shortcuts.

To address the problem of trackpad disorientation, we chose to switch from a desktop trackpad to a touchscreen. The absolute spatial mapping of smartphone and tablet devices would allow users to drag a finger across the graph without losing their place. We specifically chose iOS over Android because iOS is widely used in the blind community.
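The distinction is simple to express in code. In this sketch (the function names are ours, purely for illustration), a trackpad-style mapping offsets the playhead from wherever it last was, while a touchscreen-style mapping ties the playhead directly to where the finger lands:

```swift
// Relative (trackpad-style): each movement offsets the playhead from
// its previous position, so touching a corner does not jump there.
func relativePosition(previous: Double, deltaX: Double, padWidth: Double) -> Double {
    min(max(previous + deltaX / padWidth, 0), 1)
}

// Absolute (touchscreen): the touch location itself is the position,
// so the right edge of the screen is always the right edge of the graph.
func absolutePosition(touchX: Double, screenWidth: Double) -> Double {
    min(max(touchX / screenWidth, 0), 1)
}
```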

Phase Three

Early Prototype

In phase 3, we analyzed our problem space research to prioritize features for this tool:

  • Move through sonified line graph
  • Play an "overview" of the graph audio
  • Read out value at finger's point on graph
  • Indicate edges of the graph
  • Read out current price
  • Read out start price
  • Read out high/low price
  • Read out the title
  • Read out the X and Y axes



We brainstormed and assigned a set of gestures to provide the information for each of those features.

We built our prototype as an iOS application in Swift. A double-tap at the top or bottom of the screen read out the high or low value, respectively; a double-tap on the left side of the screen read out the starting value, and on the right side, the current value. A single tap anywhere on the screen read out the price and date of that point on the graph.
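As an illustration of that gesture-to-speech mapping, here is a minimal Swift sketch. The 25% edge regions, the AVSpeechSynthesizer output, and all class and method names are our assumptions for the sketch, not necessarily how the prototype was implemented:

```swift
import UIKit
import AVFoundation

/// A minimal sketch of the gesture-to-speech mapping described above.
final class AudioGraphViewController: UIViewController {
    var prices: [Double] = []
    var dates: [String] = []
    private let synthesizer = AVSpeechSynthesizer()

    override func viewDidLoad() {
        super.viewDidLoad()

        let doubleTap = UITapGestureRecognizer(target: self,
                                               action: #selector(handleDoubleTap(_:)))
        doubleTap.numberOfTapsRequired = 2
        view.addGestureRecognizer(doubleTap)

        let singleTap = UITapGestureRecognizer(target: self,
                                               action: #selector(handleSingleTap(_:)))
        singleTap.require(toFail: doubleTap) // don't also fire on double-taps
        view.addGestureRecognizer(singleTap)
    }

    /// Double-taps near an edge read out a summary value for that edge.
    @objc private func handleDoubleTap(_ gesture: UITapGestureRecognizer) {
        guard !prices.isEmpty else { return }
        let p = gesture.location(in: view)
        let b = view.bounds
        if p.y < b.height * 0.25 {
            speak("High: \(prices.max()!)")       // top of screen
        } else if p.y > b.height * 0.75 {
            speak("Low: \(prices.min()!)")        // bottom of screen
        } else if p.x < b.width * 0.25 {
            speak("Start: \(prices.first!)")      // left side
        } else if p.x > b.width * 0.75 {
            speak("Current: \(prices.last!)")     // right side
        }
    }

    /// A single tap maps its horizontal position to a point in the
    /// series and reads out that point's price and date.
    @objc private func handleSingleTap(_ gesture: UITapGestureRecognizer) {
        guard !prices.isEmpty, prices.count == dates.count else { return }
        let fraction = gesture.location(in: view).x / view.bounds.width
        let index = Int(min(max(fraction, 0), 1) * CGFloat(prices.count - 1))
        speak("\(prices[index]) on \(dates[index])")
    }

    private func speak(_ text: String) {
        synthesizer.speak(AVSpeechUtterance(string: text))
    }
}
```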

Testing

We tested our prototype with 1 participant with a severe visual impairment and 3 participants who are blind.

Main test

We asked participants to think aloud as they explored the graph, used gestures, and completed tasks on both an iPad and an iPhone.

Results

Participants correctly described the general movement of line graphs and identified when values had spiked.

Participants reacted positively to being able to read out specific data points along the graph.

Participants found some gestures confusing. For example, our team had to explain what double-tapping at the top and bottom of the screen meant multiple times before participants became comfortable. One participant specifically suggested switching from custom gestures to gestures already used in VoiceOver, Apple's built-in screen reader.

We continued to refine our work based on feedback from participants. To see our final prototype, which is a culmination of several rounds of design and testing, please click here.

About

Sonify: Making graphs accessible. For our MHCI Capstone project, we partnered with Bloomberg L.P. to investigate computer accessibility in desktop applications for people with disabilities.

Sponsors

Carnegie Mellon University
Bloomberg L.P.