ABSTRACT
An unhealthy diet is a leading risk factor for obesity and numerous chronic diseases. To help the public adopt a healthy diet, nutrition scientists need user-friendly tools for Dietary Assessment (DA). In recent years, new DA tools have been developed that use a smartphone or a wearable device to acquire images during a meal. These images are then processed to estimate the calories and nutrients of the consumed food. Although considerable progress has been made, 2D food images lack a scale reference and 3D volumetric information. In addition, the food must be sufficiently observable in the image. This basic condition is met when the food is stand-alone (no food container is used) or served on a shallow plate. It cannot be met easily, however, when a bowl is used: the food is often occluded by the bowl edge, and the shape of the bowl may not be fully determined from the image. Yet bowls are the food containers most widely used by billions of people in many parts of the world, especially in Asia and Africa. In this work, we propose to premeasure plates and bowls using a marked adhesive strip before a dietary study starts. This simple procedure eliminates the need for a scale reference throughout the DA study. In addition, we use mathematical models and image processing to reconstruct the bowl in 3D. Our key idea is to estimate how full the bowl is rather than how much food (in either volume or weight) it contains, which reduces the effect of occlusion. Experimental data show that our methods enable accurate DA studies using both plates and bowls while reducing the burden on research participants.
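The fullness idea above can be made concrete with a small numerical sketch. Assuming the bowl interior is modeled as a surface of revolution with a radius profile r(z) (the power-law profile below is purely illustrative; the paper derives the actual profile from the premeasured bowl), the food volume up to a fill height follows from disk integration, and the fullness ratio is the filled volume over the bowl's capacity:

```python
import numpy as np

def bowl_volume(fill_height, R=7.0, H=5.0, p=0.5, n=1000):
    """Volume (cm^3) of food filling a bowl of revolution up to fill_height.

    The bowl interior is modeled with a hypothetical power-law radius
    profile r(z) = R * (z / H)**p, where R is the rim radius (cm), H the
    interior depth (cm), and z is height measured from the bowl bottom.
    """
    z = np.linspace(0.0, fill_height, n)
    r = R * (z / H) ** p
    f = np.pi * r ** 2                    # cross-sectional disk area
    # Trapezoidal rule for V = integral of pi * r(z)^2 dz
    return float(np.sum((f[:-1] + f[1:]) * np.diff(z)) / 2.0)

capacity = bowl_volume(5.0)               # filled to the brim
half_height = bowl_volume(2.5)            # filled to half the depth
fullness = half_height / capacity         # fraction of capacity
```

Note that for this profile, filling the bowl to half its depth yields only a quarter of its capacity, which illustrates why estimating the fullness fraction (a ratio of two volumes from the same premeasured bowl model) is more robust to occlusion than estimating the absolute food volume directly from pixels.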
Estimating Amount of Food in a Circular Dining Bowl from a Single Image