Guo Receives Honorable Mention at MobileHCI 2016

September 16, 2016
Guo with Award

Anhong Guo, a Ph.D. student in the Human-Computer Interaction Institute (HCII), received an Honorable Mention at the 2016 MobileHCI conference in Florence, Italy. His paper, "Exploring Tilt for No-Touch, Wrist-Only Interactions on Smartwatches," presented two tilt-based interaction techniques for smartwatch users. Neither technique requires a hand to manipulate the touchscreen or press hardware buttons, creating a truly hands-free experience.

Building Hands-Free Interactions

Guo, who conducted the majority of the research for the paper during a summer internship with Microsoft Research, explained that the smartwatch has an advantage over a handheld device like the smartphone: it leaves at least one of a user's hands free. Unfortunately, this benefit is all but cancelled out by the interaction requirements of standard smartwatches, which rely heavily on manipulating a touchscreen or pressing physical buttons.

"Worn on the wrist, smartwatches do not require users to hold the device, leaving at least one hand free to engage in other activities. Unfortunately, this benefit is thwarted by the typical interaction model of smartwatches," explained the paper.

"The goal of our research is to promote more 'hands-free' interactions so that smartwatch users can engage in a multitude of activities that define mobility, from carrying objects such as a briefcase or a cup of coffee, to performing tasks such as opening a door..."

Leveraging Your Wrist

In his paper with Tim Paek of Microsoft Research, Guo presents no-touch interactions for smartwatches driven by wrist movements rather than touchscreen input. The paper introduces two interaction techniques, AnglePoint and ObjectPoint, which use the motion sensors built into Android Wear devices.

Smartwatch gestures

The interactions allow users to navigate a menu and select options with only wrist movements. In a music application, for example, a user can start or pause a song, skip to the next or previous track, and adjust the volume using only these techniques.

Guo and Paek believe that ObjectPoint, which uses an object imbued with a physics model as the underlying virtual pointer, provides users with a better understanding of how their tilt motions translate into action.
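To give a sense of the idea, here is a minimal sketch, not the authors' implementation, of a physics-driven pointer in the spirit of ObjectPoint: the virtual pointer is treated as an object whose acceleration comes from wrist tilt, with friction so it settles when the wrist levels out. The class name, gain, and friction constants are all illustrative assumptions.

```python
# Hypothetical sketch of a tilt-driven pointer with a simple physics model.
# Tilt acts like gravity on the pointer object; friction damps its velocity.
FRICTION = 0.9   # per-step velocity damping (illustrative constant)
GAIN = 0.5       # tilt-to-acceleration gain (illustrative constant)

class TiltPointer:
    def __init__(self):
        self.position = 0.0   # pointer position along a 1-D menu
        self.velocity = 0.0

    def step(self, tilt_deg, dt=0.02):
        """Advance the model one frame given wrist tilt in degrees."""
        self.velocity += GAIN * tilt_deg * dt   # tilt accelerates the object
        self.velocity *= FRICTION               # friction slows it down
        self.position += self.velocity * dt     # integrate to a new position
        return self.position
```

Because the pointer coasts and decelerates like a physical object, users can form an intuition for how a sustained or sharp tilt will move it, which is the kind of predictability the authors attribute to ObjectPoint.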

To read the complete paper, you can access it in the ACM Digital Library. For more information about mobile and wearable computing, visit more articles on the HCII site.

View Full Paper