VizLens
2016
VizLens: A Robust and Interactive Screen Reader for Interfaces in the Real World
The world is full of physical interfaces that are inaccessible to blind people, from microwaves and information kiosks to thermostats and checkout terminals. Blind people cannot independently use such devices without at least first learning their layout, and usually only after labeling them with sighted assistance.
VizLens is an accessible mobile application and supporting backend that can help blind people use nearly any interface they encounter by providing accurate and real-time feedback.
VizLens users capture a photo of an inaccessible interface and send it to multiple crowd workers, who work in parallel to quickly label and describe its elements, making subsequent computer vision easier. VizLens then uses computer vision to match the live camera view against the labeled reference image and gives real-time audio guidance as the user moves a finger over the interface. This deep integration of crowdsourcing and computer vision addresses a long-standing challenge in accessibility and foreshadows a future of increasingly powerful interactive applications that would be impossible with either approach alone.
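To illustrate the vision step described above, here is a minimal sketch of how a VizLens-style app might align a crowd-labeled reference photo with a live camera frame and guide a finger toward a target button. It assumes an OpenCV feature-matching approach; the file names, the `buttons` label dictionary, and the `fingertip` position are illustrative assumptions, not part of the actual VizLens implementation.

```python
import cv2
import numpy as np

# Crowd-provided labels: button name -> (x, y) center in the reference image
# (hypothetical example data).
buttons = {"start": (120, 340), "stop": (120, 400)}

# Hypothetical file names for the labeled reference photo and a camera frame.
reference = cv2.imread("labeled_interface.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)

# Detect and describe local features in both images.
orb = cv2.ORB_create(nfeatures=2000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

# Match descriptors and keep the strongest correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)[:200]

# Estimate a homography mapping reference coordinates into the live frame.
src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Project the target button into frame coordinates and compare with the
# user's fingertip (fingertip detection itself is out of scope here).
target = np.float32([[buttons["start"]]])   # shape (1, 1, 2)
tx, ty = cv2.perspectiveTransform(target, H)[0][0]
fingertip = (200.0, 360.0)                  # assumed fingertip position in the frame
dx, dy = tx - fingertip[0], ty - fingertip[1]
print(f"move {'right' if dx > 0 else 'left'} and {'down' if dy > 0 else 'up'}")
```

In a running system this loop would repeat on every camera frame, turning the directional hints into continuous spoken feedback as the finger approaches the target.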
Link to Project
VizLens paper (pdf)
Researchers
Jeffrey Bigham
Students
Xiang Anthony Chen
Research Areas
Accessibility, Crowdsourcing