FitByte Uses Sensors on Eyeglasses To Automatically Monitor Diet
CMU Researchers Propose Multimodal System To Track Food and Liquid Intake
Food plays a big role in our health, so many people trying to improve their diet track what they eat. A new wearable from researchers in Carnegie Mellon University's School of Computer Science helps wearers track their eating habits with high fidelity.
FitByte, a noninvasive, wearable sensing system, combines the detection of sound, vibration and movement to increase accuracy and decrease false positives. It could help users reach their health goals by tracking behavioral patterns, and give practitioners a tool to understand the relationship between diet and disease and to monitor the efficacy of treatment.
The device tracks all stages of food intake. It detects chewing, swallowing, hand-to-mouth gestures and visuals of intake, and can be attached to any pair of consumer eyeglasses. "The primary sensors on the device are accelerometers and gyroscopes, which are in almost every device at this point, like your phones and your watches," said Mayank Goel, an assistant professor in the Institute for Software Research and Human-Computer Interaction Institute.
An infrared proximity sensor detects hand-to-mouth gestures. To identify chewing, the system monitors jaw motion using four gyroscopes around the wearer's ears; positioned behind the ears, these sensors track the flexing of the temporalis muscle as the user moves their jaw. High-speed accelerometers placed near the glasses' earpiece detect throat vibrations during swallowing. This combination addresses the longstanding challenge of accurately detecting drinking and the intake of soft foods such as yogurt and ice cream.
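The paper's detection pipeline is not spelled out here, but the general idea of turning these sensor streams into discrete chewing and swallowing events can be sketched in a few lines. The sketch below is purely illustrative, with made-up sampling rates, frequency bands and thresholds: it band-pass filters a jaw gyroscope signal to find rhythmic chew cycles, and an accelerometer signal to find brief swallow vibrations.

```python
# Illustrative sketch only: FitByte's actual models are not reproduced here.
# A simple band-pass + peak-picking approach stands in for how jaw-motion
# (gyroscope) and throat-vibration (accelerometer) streams might be turned
# into chewing/swallowing events. All constants are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

FS_GYRO = 100      # Hz, assumed gyroscope sampling rate
FS_ACCEL = 4000    # Hz, assumed high-speed accelerometer sampling rate

def bandpass(signal, lo, hi, fs, order=2):
    """Keep only the frequency band where the motion of interest lives."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def detect_chewing(jaw_gyro, fs=FS_GYRO):
    """Chewing shows up as a rhythmic ~1-2 Hz oscillation of the jaw.
    Returns sample indices of candidate chew cycles."""
    filtered = bandpass(jaw_gyro, 0.5, 3.0, fs)
    peaks, _ = find_peaks(filtered, height=np.std(filtered), distance=fs // 3)
    return peaks

def detect_swallows(throat_accel, fs=FS_ACCEL):
    """Swallowing produces brief, high-frequency throat vibrations.
    Returns sample indices of candidate swallow events."""
    filtered = bandpass(throat_accel, 20.0, 500.0, fs)
    envelope = np.abs(filtered)
    peaks, _ = find_peaks(envelope, height=5 * np.median(envelope), distance=fs)
    return peaks

# Example with synthetic data standing in for a real gyroscope stream.
t = np.linspace(0, 10, 10 * FS_GYRO)
fake_jaw = 0.5 * np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
print(f"candidate chew cycles: {len(detect_chewing(fake_jaw))}")
```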
A small camera at the front of the glasses points downward to capture just the area around the mouth and only turns on when the model detects the user eating or drinking. "To address issues of privacy, we're currently processing everything offline," said Abdelkareem Bedri, an HCII doctoral student. "The captured images are not shared anywhere except the user's phone."
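As an illustration of that capture policy (not FitByte's actual code), the following sketch assumes a hypothetical `intake_model` that scores each window of sensor data and a hypothetical `camera` object. The camera is only asked for a frame when the model's score crosses a threshold, and the frame stays in local storage on the user's device.

```python
# Minimal sketch of the capture policy described above, under assumed
# interfaces: the camera stays off until the model flags eating or drinking,
# and captured frames never leave local storage.
from dataclasses import dataclass
from typing import Callable

@dataclass
class CapturePolicy:
    intake_model: Callable[[list], float]  # hypothetical: returns intake probability
    camera: object                         # hypothetical: exposes .capture() -> bytes
    threshold: float = 0.8

    def on_sensor_window(self, window: list, local_store: list) -> None:
        """Called for each new window of IMU/proximity data."""
        if self.intake_model(window) >= self.threshold:
            frame = self.camera.capture()  # camera powers on only here
            local_store.append(frame)      # kept on the user's phone; nothing uploaded
```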
At this point, the system relies on users to identify the food and drink in photos. But the research team has plans for a larger test deployment, which will supply the data deep learning models need to automatically discern food type.
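Once labeled photos exist, one standard way to build such a model is to fine-tune a pretrained image classifier. The sketch below is a generic example of that approach rather than the team's planned pipeline; the folder layout and the number of food categories are invented.

```python
# Hedged sketch: fine-tuning a pretrained classifier on user-labeled intake
# photos. Dataset path and class count are hypothetical.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_FOOD_CLASSES = 50  # hypothetical number of food/drink categories

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumed folder of labeled photos: intake_photos/train/<food_label>/<image>.jpg
train_set = datasets.ImageFolder("intake_photos/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_FOOD_CLASSES)  # new classification head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one pass shown for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```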
FitByte was tested in five unconstrained situations: a lunch meeting, watching TV, having a quick snack, exercising in a gym and hiking outdoors. Modeling across such noisy data allows the algorithm to generalize to everyday conditions.
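A common way to check that kind of generalization is a leave-one-condition-out evaluation: train on four of the five situations and test on the one held out. The sketch below illustrates the idea with a stand-in classifier and hypothetical feature arrays, not the study's actual models.

```python
# Leave-one-condition-out evaluation sketch; features, labels and the
# classifier are placeholders, not the paper's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

CONDITIONS = ["lunch_meeting", "watching_tv", "snack", "gym", "hiking"]

def evaluate_loco(features: np.ndarray, labels: np.ndarray,
                  condition_ids: np.ndarray) -> dict:
    """features: (n_windows, n_features); labels: 1 = intake, 0 = not;
    condition_ids: which of the five situations each window came from."""
    scores = {}
    for held_out in CONDITIONS:
        test_mask = condition_ids == held_out
        clf = LogisticRegression(max_iter=1000)
        clf.fit(features[~test_mask], labels[~test_mask])   # train on 4 conditions
        preds = clf.predict(features[test_mask])            # test on the held-out one
        scores[held_out] = f1_score(labels[test_mask], preds)
    return scores
```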
"Our team can take sensor data and find behavior patterns. In what situations do people consume the most? Are they binge eating? Do they eat more when they're alone or with other people? We are also working with clinicians and practitioners on the problems they'd like to address," Goel said.
The team will continue developing the system by adding more noninvasive sensors that will allow the model to detect blood glucose levels and other important physiological measures. The researchers are also creating an interface for a mobile app that could share data with users in real time.
Other contributing researchers include CMU students Diana Li, Rushil Khurana and Kunal Bhuwalka. The paper was accepted by the Conference on Human Factors in Computing Systems (CHI 2020), which was scheduled for this month but canceled due to the COVID-19 pandemic. It's available in the ACM Digital Library.