Modular Bayesian Networks with Low-Power Wearable Sensors for Recognizing Eating Activities
- PMID: 29232937
- PMCID: PMC5751632
- DOI: 10.3390/s17122877
Abstract
Recently, recognizing a user's daily activities with smartphone and wearable sensors has attracted considerable attention. However, unlike the controlled conditions of a laboratory experiment, real life involves numerous complex activities shaped by varied backgrounds and contexts: time, space, age, culture, and so on. Recognizing such complex activities with a limited set of low-power sensors, while simultaneously respecting the power and memory constraints of the wearable environment and minimizing obtrusiveness to the user, is a difficult problem, yet it is crucial if an activity recognizer is to be practically useful. In this paper, we recognize eating, one of the most typical complex activities, using only everyday low-power mobile and wearable sensors. To organize the related contexts systematically, we construct a context model based on activity theory and the "Five W's", and propose a Bayesian network with 88 nodes that predicts uncertain contexts probabilistically. The structure of the proposed Bayesian network is designed in a modular, tree-structured manner to reduce time complexity and increase scalability. To evaluate the proposed method, we collected data on 10 different activities from 25 volunteers of various ages and occupations, and obtained 79.71% accuracy, outperforming conventional classifiers by 7.54-14.4%. Analysis of the results shows that our probabilistic approach can also produce approximate results even when one of the contexts or sensor values has a highly heterogeneous pattern or is missing.
Keywords: Bayesian network; context-awareness; human activity recognition; mobile application; wearable computing.
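To illustrate the kind of probabilistic inference the abstract describes, the sketch below shows a tiny tree-structured Bayesian network that computes a posterior over an "eating" activity from discrete sensor contexts, and marginalizes out any sensor whose value is missing. All node names, states, and probabilities here are illustrative placeholders, not the paper's actual 88-node network or its learned parameters.

```python
# Minimal sketch: tree-structured Bayesian inference for activity recognition.
# All nodes and probabilities are hypothetical, for illustration only.

PRIOR = {"eating": 0.2, "other": 0.8}  # P(activity)

# Conditional probability tables P(sensor value | activity); in a
# tree-structured network each sensor node has the activity as its parent.
CPT = {
    "motion":   {"eating": {"still": 0.7, "walking": 0.3},
                 "other":  {"still": 0.4, "walking": 0.6}},
    "location": {"eating": {"kitchen": 0.6, "office": 0.4},
                 "other":  {"kitchen": 0.2, "office": 0.8}},
    "time":     {"eating": {"mealtime": 0.8, "offpeak": 0.2},
                 "other":  {"mealtime": 0.3, "offpeak": 0.7}},
}

def posterior(evidence):
    """Return P(activity | evidence) by Bayes' rule.

    Sensors absent from `evidence` are simply marginalized out, so the
    network still yields an approximate posterior when values are missing.
    """
    scores = {}
    for activity, prior in PRIOR.items():
        p = prior
        for sensor, value in evidence.items():
            p *= CPT[sensor][activity][value]
        scores[activity] = p
    z = sum(scores.values())
    return {activity: p / z for activity, p in scores.items()}

# Full evidence: all three sensors observed.
full = posterior({"motion": "still", "location": "kitchen", "time": "mealtime"})

# Partial evidence: location and time sensors missing; inference degrades
# gracefully instead of failing outright.
partial = posterior({"motion": "still"})
```

Because each sensor node has a single parent, the joint factorizes into a product of the prior and per-sensor conditionals, which is what keeps inference in a tree-structured design linear in the number of nodes rather than exponential.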
Conflict of interest statement
The authors declare no conflict of interest.
