I believe every human has a finite number of heartbeats.
I don't intend to waste any of mine.


- Neil Armstrong

MISSION
To accelerate astronaut workflow aboard the International Space Station.
Background

After years of assembly, module by module, the International Space Station (ISS) is finally complete.

Now, the focus turns to payload science – performing scientific experiments in zero gravity. Crew members' schedules are packed down to the minute with such tasks, and they must follow procedures to perform them efficiently and effectively.

Problem

The International Procedure Viewer (IPV) prevents crew members from operating at maximum efficiency.

Procedures within the IPV are static and do not represent the ad hoc nature of the actual tasks. More visual assistance is also necessary, especially for complex procedures. Furthermore, the IPV is stationary, forcing crew members to travel back and forth throughout the ISS.

Goal

We are designing, prototyping, and building a procedure viewing device that:

  • Streamlines information displays
  • Adapts procedures based on context
  • Allows crew member mobility
  • Provides means for hands-free interaction

RESEARCH
Understanding task execution in isolated environments that rely on scientific, maintenance, and procedure-viewing tools.

OVERVIEW

After completing a literature review of our problem space and a competitive analysis of procedure viewers, we dug into the heart of our work: domain-specific and analogous user research.

Using UX methods like contextual inquiry, directed storytelling, and card sorting, we interviewed 27 domain experts:

  • 2 Astronauts
  • 9 NASA employees
  • 5 Pilots
  • 4 Deep sea divers
  • 2 Paramedics
  • 2 Lab technicians
  • 2 Construction managers
  • 1 Pit crew member

PRELIMINARY

Literature Review

Our literature review focused on the following:

  • Psychology of space flight
  • Structure of ISS/living in space
  • Organization and presentation of procedures
  • Cognitive processing of tasks
  • Collaboration

The articles we read gave us a number of critical insights that informed our approach to user research, such as:

  • Isolation and lack of on-the-ground normalcy can increase anxiety, insomnia, and depression in crew members.
  • Procedure display methods proven effective across various domains include checklists, augmented reality displays, and in-context displays.
  • High mental workload can decrease performance, especially when interruptions are forced during those periods of heightened workload.

Competitive Analysis

Our competitive analysis looked at existing procedure viewers, including the procedure viewer currently in use on the ISS:
  • The European Space Agency's WEAR, a head-mounted augmented reality display, overlays instructions in the operator's field of vision.
  • Pilots use a central monitoring system that automatically provides procedure steps based on current aircraft parameters.
  • Medical procedure viewers combine audio, video, and textual information for understandability under life-threatening conditions.

DOMAIN-SPECIFIC

"Most the time, first-time experiences don’t work out very well... I got about 3 hours sleep a night that whole mission."
- Astronaut

Our research began and ended with astronaut interviews; both interviewees had served as commanders. One had worked on the Mir space station, while the other had been aboard the ISS as recently as six months before the interview.

At the Johnson Space Center in Houston, we interviewed experts within Operations Planning, Payload Science, Human Factors, Next Generation Scheduling, Wearable Computing, and Flight Deck of the Future. We also toured the Space Vehicle Mockup Facility and the Neutral Buoyancy Laboratory.

  • We interviewed two astronauts - one by phone and one in person. We also got a question answered by a third astronaut through YouTube.
  • Visiting the Space Vehicle Mockup Facility, which houses a full-size mock-up of the ISS, provided us with a sense of scale regarding astronaut movement onboard.
  • A full-sized mock-up of the ISS is submerged in the pool at the Neutral Buoyancy Laboratory, where astronauts train for the weightless environment of space.
  • Looking at e-textiles, fabrics that allow small computers to be embedded within them, inspired us to explore possible solutions within wearable computing.
  • We found a foot-controlled flight simulator quite intuitive to use, encouraging us to think about incorporating novel interaction methods.

ANALOGOUS DOMAINS

"Comply exactly with the procedure, or cause the procedure to be officially changed."
- Commercial Pilot

Due to the expected difficulty of accessing astronauts, we investigated domains with parallels in workflow and environment. We focused on the following five criteria:

  • Is documentation required while completing a task?
  • Are there a variety of tasks present?
  • Does the work involve ad hoc tasks?
  • Are tasks executed in an isolated environment?
  • Does the work require the use of tools?


  • We talked to pilots, deep sea divers, lab technicians, paramedics, construction managers, and pit crews based on the similarities their tasks shared with payload science and maintenance work.
  • We performed an interview and concept mapping session with two commercial cargo pilots. Afterwards, we went into the cockpit of an Airbus A310 to study their equipment and documentation.
  • We were given a full tour of a diving facility by three deep sea divers. A whiteboarding session followed, with the divers diagramming the processes involved in decompression and working dives.
  • We performed contextual inquiries with two lab technicians at a disease diagnostic research facility in Seattle.
  • We conducted a contextual inquiry with a paramedic, riding along in the back of an ambulance during an actual emergency call.

DESIGN
Introducing PEER, a solution that enhances the display of procedures while enabling portable, hands-free usage.

PEER

Based on our user needs, we developed PEER, a Procedure Execution with Enhanced Reality system for head-mounted displays.

PEER is an Android application designed for the Epson Moverio head-mounted display, using voice control to provide users with a completely hands-free experience.
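
To illustrate how such a hands-free command loop can be wired up on Android, here is a minimal sketch built on the platform's stock SpeechRecognizer; VoiceNavigator, ProcedureView, and the three-command vocabulary are hypothetical stand-ins rather than our actual implementation.

    // Minimal sketch of hands-free step navigation using Android's stock
    // SpeechRecognizer. VoiceNavigator, ProcedureView, and the command
    // vocabulary are hypothetical, not PEER's actual code.
    import android.content.Context;
    import android.content.Intent;
    import android.os.Bundle;
    import android.speech.RecognitionListener;
    import android.speech.RecognizerIntent;
    import android.speech.SpeechRecognizer;

    import java.util.ArrayList;

    interface ProcedureView {
        int currentStep();
        void showStep(int index);  // renders the given step on the display
    }

    public class VoiceNavigator implements RecognitionListener {
        private final SpeechRecognizer recognizer;
        private final ProcedureView view;

        public VoiceNavigator(Context context, ProcedureView view) {
            this.view = view;
            this.recognizer = SpeechRecognizer.createSpeechRecognizer(context);
            this.recognizer.setRecognitionListener(this);
        }

        public void listen() {
            Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
            recognizer.startListening(intent);
        }

        @Override
        public void onResults(Bundle results) {
            ArrayList<String> hypotheses =
                    results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
            if (hypotheses != null && !hypotheses.isEmpty()) {
                dispatch(hypotheses.get(0).toLowerCase());
            }
            listen();  // immediately resume listening to stay hands-free
        }

        // Map a small, fixed vocabulary onto granular step navigation.
        private void dispatch(String command) {
            if (command.contains("next")) {
                view.showStep(view.currentStep() + 1);
            } else if (command.contains("previous")) {
                view.showStep(view.currentStep() - 1);
            } else if (command.contains("repeat")) {
                view.showStep(view.currentStep());
            }
        }

        // Remaining RecognitionListener callbacks are no-ops in this sketch.
        @Override public void onReadyForSpeech(Bundle params) {}
        @Override public void onBeginningOfSpeech() {}
        @Override public void onRmsChanged(float rmsdB) {}
        @Override public void onBufferReceived(byte[] buffer) {}
        @Override public void onEndOfSpeech() {}
        @Override public void onError(int error) { listen(); }  // retry on errors
        @Override public void onPartialResults(Bundle partialResults) {}
        @Override public void onEvent(int eventType, Bundle params) {}
    }

Restarting the recognizer after every result (and after errors) is what keeps the experience continuously hands-free; a real app would also need the RECORD_AUDIO permission and some debouncing against misrecognitions.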

Core features include:

  • Visualizing information and directions using images, videos, and augmented reality
  • Navigating through steps in a highly granular fashion
  • Displaying information when and where it is needed
  • Accelerating data analysis using object recognition
  • Modularizing procedure authoring with an extensible XML-based paradigm (sketched below)
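
To give a sense of what this authoring paradigm might look like, here is a hypothetical procedure file; the element and attribute names below are illustrative only, not PEER's actual schema:

    <!-- Hypothetical procedure markup; element and attribute names are
         illustrative only, not PEER's actual schema. -->
    <procedure id="filter-swap" title="Particulate Filter Swap">
      <step number="1" location="node2-stowage-4">
        <instruction>Retrieve the spare filter from stowage.</instruction>
        <image src="filter_locker.png"/>
      </step>
      <step number="2" location="lab-rack-2" timer="120s">
        <instruction>Power down the rack and wait two minutes.</instruction>
        <caution>Verify the status light is off before proceeding.</caution>
      </step>
      <step number="3" location="lab-rack-2">
        <instruction>Swap the filter and record its serial number.</instruction>
        <video src="filter_swap.mp4"/>
        <data-entry field="serial-number" input="voice"/>
      </step>
    </procedure>

Because each step carries its own location, media, and data-entry elements, new element types could be added over time without disturbing how existing procedures are parsed or displayed.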

VISIONING

After synthesizing our research, we used our findings to jumpstart our ideation process.

We let our ideas go broad. Unwilling to let technological constraints limit our scope, we generated over four hundred initial design ideas. These were later consolidated into a few distinct visions that we hoped to pursue.

After presenting our ideas at the NASA Ames Research Center, we narrowed down our design direction through a series of visioning activities with our clients. After further research at the Augmented World Expo, we ultimately decided to pursue procedure execution on the head-mounted display.

  • We synthesized our research by using affinity diagrams and consolidated models to find patterns within our data.
  • Using our research as a jumping-off point, we came up with four hundred initial design concepts and pursued the concepts we found most compelling.
  • We used storyboards to help visualize our use cases. This storyboard, focusing on the use of augmented reality and head-mounted displays, led the way to PEER.
  • We engaged in a physical prototyping activity with our NASA clients to weigh the advantages and disadvantages of various hardware platforms.
  • In a Kano modeling exercise, our clients grouped a set of candidate features by importance, helping us determine our core feature set.

PROTOTYPING

Designing for a head-mounted display was a challenge, but our rapid iteration cycle quickly fine-tuned our interactions.

We began our prototyping process by creating a simulated ISS environment, as well as an analog testing procedure. With these, we conducted usability tests with 24 participants across 6 iterations of our interface.

In order to adequately test our interface, we had to simulate the visual experience of a see-through, wearable interface that follows your gaze and listens to your voice. This forced us to devise new prototyping methods, such as using transparencies or remotely controlling our head-mounted display through a secondary device.

  • Our simulated ISS environment contained several modules, stowage lockers, and foam "filters" attached to the walls.
  • For our low-fidelity prototype, we drew the interface on transparencies and had two human "computers" update it whenever the participant uttered a voice command.
  • For our medium-fidelity prototype, we displayed the interface on the Epson Moverio, with screens updated remotely from a laptop whenever the participant voiced a command (a sketch of this remote-control link follows this list).
  • In order to compensate for the Moverio's lack of sensors, we paired it with a tablet to utilize the tablet's microphone and camera.
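
As a concrete example of the remote-control method, a stripped-down "wizard" link could look like the sketch below; the port number and line-based command format are illustrative, not our actual test rig:

    // A stripped-down Wizard of Oz link: the facilitator's laptop connects
    // over the local network and names the screen to show. The port and the
    // one-command-per-line format are illustrative, not the actual rig.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class WizardServer {
        public static void main(String[] args) throws Exception {
            try (ServerSocket server = new ServerSocket(5150)) {
                while (true) {  // accept a new connection if the last one drops
                    try (Socket laptop = server.accept();
                         BufferedReader in = new BufferedReader(
                                 new InputStreamReader(laptop.getInputStream()))) {
                        String command;
                        // Each line names a screen, e.g. "step-3" sent right
                        // after the participant says "next step".
                        while ((command = in.readLine()) != null) {
                            System.out.println("Showing screen: " + command);
                            // The prototype would swap the rendered screen here.
                        }
                    }
                }
            }
        }
    }

A facilitator can then drive the display from a laptop with any plain TCP client, such as netcat, sending one screen name per line while the participant believes the system itself is responding.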

FUTURE VISION

Our prototype was limited by the hardware available and our timeframe. However, given the necessary resources, we envision PEER addressing the following:

Full Field-of-View
The form factor for our future vision would allow for a seamlessly integrated, full field-of-view display of information. Allowing real data overlays would drastically increase the effectiveness of any augmented reality feature implemented in our system.
Metadata for Tools
Attaching metadata to tools can help resolve issues of tool identification and retrieval, especially if paired with object recognition. Information concerning a tool's usage, along with custom notes, would decrease the need for stowage notes and allow crew members to be more productive.
Digital Information Widgets
Our future vision allows digital widgets to be "attached" to real-world objects: notes, timers for time-sensitive procedure steps, and beacons for easy identification. In this way, physical clutter on the station would be reduced and important information would be displayed completely in context.
Data Input through Computer Vision
Payload science procedures often require the astronaut to record experiment results. This process can be accelerated by using computer vision algorithms to analyze and input these measurements in real time.
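One plausible interaction shape is to pre-fill a data field from the camera and ask the crew member to confirm it. The sketch below assumes some MeasurementReader computer vision component; none of these names are an actual API.

    // Hypothetical flow for camera-assisted data entry. MeasurementReader
    // stands in for a real computer vision pipeline.
    import java.util.Optional;

    interface MeasurementReader {
        // Try to read a numeric value (e.g. a gauge level) from a camera frame.
        Optional<Double> read(byte[] frame);
    }

    class DataEntryStep {
        private final MeasurementReader reader;

        DataEntryStep(MeasurementReader reader) {
            this.reader = reader;
        }

        // Pre-fill the procedure's data field, but have the crew member
        // confirm or correct the value rather than trusting it blindly.
        String prefill(byte[] cameraFrame, String fieldName) {
            return reader.read(cameraFrame)
                    .map(value -> fieldName + " = " + value + "  (say \"confirm\" or restate)")
                    .orElse(fieldName + " = ___  (no reading; enter manually)");
        }
    }
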
Natural Language Interaction
Recognizing spoken phrases or ideas, rather than fixed commands, would help users achieve their goals without memorizing command sets and navigation flows.
Annotation Capability
Including fully developed interactions for taking and viewing notes, especially image-based or video-based notes, would allow for greater personalization of procedures and more effective procedural content.

ABOUT
We love working together.
Stephen Trahan
project lead

Stephen holds a Bachelor's in psychology from the University of South Florida. Stephen enjoys spending time with his wife, computer games, and learning new acronyms.

Kristina Lustig
research lead

Kristina has a Bachelor's in linguistics from New York University. As a Pittsburgh native, she's excited to finally live in a place where she can see the sun more than twice a year.

Jenn Tran
design lead

Jenn has a Bachelor's in history, with minors in chemistry and sociology from the University of California, San Diego. After the program, she wants to trek through the Himalayas.

Gordon Liu
design lead

Gordon holds a Bachelor's in both computer science and business administration from the University of Washington. In his free time, he enjoys playing basketball and computer games.

Chris Mueller
technical lead

Chris earned a degree in Computer Science from Loyola Marymount University. Chris enjoys music, playing beach volleyball, and is a die-hard fan of the Arizona Cardinals.