Four projects from the HCII's Future Interfaces Group (FIG Lab) were honored by Fast Company's 2020 Innovation by Design Awards.
Food plays a big role in our health, so many people who want to improve their diet track what they eat. A new wearable from researchers in Carnegie Mellon University's School of Computer Science helps wearers track their food habits with high fidelity.
Today's virtual reality systems can create immersive visual experiences, but seldom do they enable users to feel anything — particularly walls, appliances and furniture. A new device developed at Carnegie Mellon University, however, uses multiple strings attached to the hand and fingers to simulate the feel of obstacles and heavy objects.
Algorithm Enables Cameras To Recognize Distinctive Exercise Motions
Wearable sensors such as smartwatches have become a popular motivational tool for fitness enthusiasts, but these gadgets do not sense all exercises equally well. Researchers at Carnegie Mellon University have found that a stationary camera can be a better choice for tracking gym exercises.
We've become accustomed to our smartwatches and smartphones sensing what our bodies are doing, be it walking, driving or sleeping. But what about our hands? It turns out that smartwatches, with a few tweaks, can detect a surprising number of things your hands are doing.
Thousands of the world’s top researchers, scientists, and designers are traveling this weekend to the ACM CHI Conference on Human Factors in Computing Systems (also known as CHI). The premier international conference on human-computer interaction will take place in Glasgow, UK, from May 4-9, 2019.
“It’s like all of March Madness in one weekend,” said Patrick Carrington, postdoctoral research fellow at the Human-Computer Interaction Institute, of the National Wheelchair Basketball Tournament (NWBT) in Louisville, Kentucky.