EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments

Abdelkareem Bedri et al. Proc ACM Interact Mob Wearable Ubiquitous Technol. 2017 Sep;1(3):37. doi: 10.1145/3130902.

Abstract

Chronic and widespread diseases such as obesity, diabetes, and hypercholesterolemia require patients to monitor their food intake, and food journaling is currently the most common method for doing so. However, food journaling is subject to self-bias and recall errors, and is poorly adhered to by patients. In this paper, we propose an alternative by introducing EarBit, a wearable system that detects eating moments. We evaluate the performance of inertial, optical, and acoustic sensing modalities and focus on inertial sensing, by virtue of its recognition and usability performance. Using data collected in a simulated home setting with minimal restrictions on participants' behavior, we build our models and evaluate them with an unconstrained outside-the-lab study. For both studies, we obtained video footage as ground truth for participants' activities. Using leave-one-user-out validation, EarBit recognized all of the eating episodes in the semi-controlled lab study, and achieved an accuracy of 90.1% and an F1-score of 90.9% in detecting chewing instances. In the unconstrained, outside-the-lab evaluation, EarBit obtained an accuracy of 93% and an F1-score of 80.1% in detecting chewing instances. It also accurately recognized all but one of the recorded eating episodes. These episodes ranged from a 2-minute snack to a 30-minute meal.

Keywords: Applied computing → Health informatics; Computing methodologies → Supervised learning by classification; Hardware → Sensor devices and platforms; Human-centered computing → Ubiquitous and mobile computing design and evaluation methods; Wearable computing; activity recognition; automatic dietary monitoring; chewing detection; earables; unconstrained environments.


Figures

Fig. 1
EarBit's data collection prototype with multiple sensors. Our semi-controlled and outside-the-lab evaluations show that the behind-the-ear IMU alone is enough to achieve usable performance. We envision such a sensor being part of future eyeglasses or augmented-reality head-mounted displays.
Fig. 2
Outside-the-lab study configuration: a) A user wearing the EarBit system and a GoPro camera. b) A GoPro frame of the user working at a desk. c) A GoPro frame of the user eating with a pair of chopsticks.
Fig. 3
An example of annotations for an eating activity. We annotated our video data at a 1-second resolution. In this 600-second example of a user having a meal, we capture all the minute transitions, including various 2-second intervals where the user stopped chewing. Mixed activities have overlapping annotations, as in the walking-and-talking example. A stationary label is also added for all instances when the user is not moving.
Fig. 4
Example data from the y-axis of the behind-the-ear gyroscope. The dots indicate local maxima with high energy in the signal. Compared to talking, the peaks for eating are more periodic and "spiky".
Fig. 5
Flowchart for the initial evaluation of the multi-sensor setup.
Fig. 6
Comparison between sensing modalities. E = behind-the-ear IMU, P = outer-ear proximity sensor, M = neck microphone. The back IMU is used in all conditions to detect whether the user was walking. The performance of the behind-the-ear IMU (E) was the most consistent across all three metrics. It was also considered the most comfortable to wear by users.
Fig. 7
Flowchart for the EarBit algorithm.
Fig. 8
An example of converting Random Forest confidence values to frame-level results (chewing) and then to event-level predictions (eating episodes).
Fig. 9
Chewing recognition results for the semi-controlled lab study.
