Volumetric Food Quantification Using Computer Vision on a Depth-Sensing Smartphone: Preclinical Study

David Herzig et al. JMIR Mhealth Uhealth. 2020 Mar 25;8(3):e15294. doi: 10.2196/15294.

Abstract

Background: Quantification of dietary intake is key to the prevention and management of numerous metabolic disorders. Conventional approaches are challenging, laborious, and lack accuracy. The recent advent of depth-sensing smartphones in conjunction with computer vision could facilitate reliable quantification of food intake.

Objective: The objective of this study was to evaluate the accuracy of a novel smartphone app combining depth-sensing hardware with computer vision to quantify meal macronutrient content using volumetry.

Methods: The app ran on a smartphone with a built-in structured-light depth sensor (iPhone X). The app estimated the weight, macronutrient content (carbohydrate, protein, fat), and energy content of 48 randomly chosen meals (breakfasts, cooked meals, snacks) encompassing 128 food items. The reference weight was generated by weighing individual food items using a precision scale. The study endpoints were (1) error of estimated meal weight, (2) error of estimated meal macronutrient content and energy content, (3) segmentation performance, and (4) processing time.
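The abstract does not detail the computation, but the described pipeline (segmentation, depth-based volume estimation, nutrient lookup) can be sketched as follows. This is a minimal illustration in Python, not the SNAQ implementation; the density values, database entries, and function names are hypothetical placeholders.

```python
# Minimal sketch of volumetric macronutrient estimation (illustrative only;
# not the SNAQ implementation). Assumes a food item has already been
# segmented and its volume estimated from the depth map.

from dataclasses import dataclass

@dataclass
class NutrientProfile:
    """Per-100 g values from a food composition database (hypothetical entries)."""
    density_g_per_ml: float
    carb_g: float
    protein_g: float
    fat_g: float
    kcal: float

# Hypothetical database entries for illustration.
FOOD_DB = {
    "boiled_potato": NutrientProfile(1.08, 17.0, 1.9, 0.1, 78.0),
    "chicken_breast": NutrientProfile(1.05, 0.0, 31.0, 3.6, 165.0),
}

def estimate_item(food_id: str, volume_ml: float) -> dict:
    """Convert an estimated volume to weight via density, then scale per-100 g nutrients."""
    p = FOOD_DB[food_id]
    weight_g = volume_ml * p.density_g_per_ml
    scale = weight_g / 100.0
    return {
        "weight_g": weight_g,
        "carb_g": p.carb_g * scale,
        "protein_g": p.protein_g * scale,
        "fat_g": p.fat_g * scale,
        "kcal": p.kcal * scale,
    }

# Example: a segmented potato item with an estimated volume of 150 ml.
print(estimate_item("boiled_potato", 150.0))
```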

Results: In both absolute and relative terms, the mean (SD) absolute errors of the app's estimates were 35.1 g (42.8 g; relative absolute error: 14.0% [12.2%]) for weight; 5.5 g (5.1 g; relative absolute error: 14.8% [10.9%]) for carbohydrate content; 1.3 g (1.7 g; relative absolute error: 12.3% [12.8%]) for fat content; 2.4 g (5.6 g; relative absolute error: 13.0% [13.8%]) for protein content; and 41.2 kcal (42.5 kcal; relative absolute error: 12.7% [10.8%]) for energy content. Although estimation accuracy was not affected by the viewing angle, the type of meal mattered, with slightly worse performance for cooked meals than for breakfasts and snacks. Segmentation adjustment was required for 7 of the 128 items. Mean (SD) processing time across all meals was 22.9 seconds (8.6 seconds).
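The abstract reports mean absolute and relative absolute errors without spelling out the formulas; a plausible per-meal reading is shown below (the example values are hypothetical, not taken from the study data).

```python
# Per-meal absolute and relative absolute error, as commonly defined
# (a plausible reading; the abstract does not give the exact formulas).
def absolute_errors(estimated: float, reference: float) -> tuple[float, float]:
    abs_err = abs(estimated - reference)       # same unit as the inputs (g or kcal)
    rel_abs_err = 100.0 * abs_err / reference  # percent of the reference value
    return abs_err, rel_abs_err

# Hypothetical example: a meal weighing 250 g on the precision scale,
# estimated at 285 g by the app.
print(absolute_errors(285.0, 250.0))  # -> (35.0, 14.0)
```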

Conclusions: This study evaluated the accuracy of a novel smartphone app with an integrated depth-sensing camera and found highly accurate volume estimation across a broad range of food items. In addition, the system demonstrated high segmentation performance and low processing time, highlighting its usability.

Keywords: computer vision; depth camera; dietary assessment; smartphone.


Conflict of interest statement

Conflicts of Interest: None declared.

Figures

Figure 1. Automated food quantification workflow performed by the SNAQ app.

Figure 2. Bland-Altman plot illustrating the difference between the estimated and reference meal weights.

Figure 3. Bland-Altman plot illustrating the difference between the estimated and reference carbohydrate content of the meals.

Figure 4. Bland-Altman plot illustrating the difference between the estimated and reference protein content of the meals.

Figure 5. Bland-Altman plot illustrating the difference between the estimated and reference fat content of the meals.

Figure 6. Bland-Altman plot illustrating the difference between the estimated and reference energy content of the meals.

Figure 7. Box plot of the processing time according to meal type. Box plots show the median (solid line), interquartile range (IQR; box outline), the spread of data points excluding outliers (whiskers), and outliers, defined as points beyond 1.5×IQR (symbols).

