Intuitively estimating the healthiness of meals from their images: image-based meal rating system to encourage self-management of diabetes

2018 
Self-management of meals is a key component of treating diabetes. Systems exist that support meal planning by processing images of meals. Most such systems output estimated nutrition values for the individual dishes; however, a more intuitive and useful approach is to motivate patients to become interested in their meals. Unfortunately, existing systems require the patient to manually input full details of the dishes, or to manually correct the rough results of automatic recognition, which deters patients from using them. In this paper, we present a system that predicts the healthiness of a meal and stimulates the patient's interest in self-management. Unlike conventional image-based methods, which estimate nutrition via food-category recognition, our proposal predicts whether a meal is healthy or not by visually processing all the dishes in the meal. Our system consists of a convolutional neural network (CNN) trained with a joint classification and ranking loss to replicate the judgement of registered dietitians; the goal is to predict "which meal is healthier". Experiments show that the proposed system can roughly match the ranking results of registered dietitians, with an overall ranking error rate of 16.4% across all images examined. The results show that the system learns to make intuitive assessments of healthiness from the images themselves, which implies that meal appearance contains valuable clues as to healthiness. The proposed system can therefore be used to detect patients who need more guidance on improving their eating style.
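The abstract describes a CNN trained with a joint classification and ranking objective so that its scores reproduce dietitians' pairwise judgements of "which meal is healthier". The sketch below illustrates one way such a joint loss can be set up; the backbone, layer sizes, margin, and loss weighting are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class MealHealthinessNet(nn.Module):
    """Hypothetical CNN that outputs both a healthy/not-healthy classification
    and a scalar healthiness score used for pairwise ranking."""
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet18(weights=None)   # assumed backbone
        feat_dim = self.backbone.fc.in_features
        self.backbone.fc = nn.Identity()
        self.classifier = nn.Linear(feat_dim, 2)        # healthy / not healthy
        self.scorer = nn.Linear(feat_dim, 1)            # scalar healthiness score

    def forward(self, x):
        feats = self.backbone(x)
        return self.classifier(feats), self.scorer(feats).squeeze(-1)

def joint_loss(model, img_a, img_b, label_a, label_b, prefer_a, alpha=1.0):
    """Cross-entropy classification loss on each meal image plus a pairwise
    margin ranking loss that pushes the score of the meal judged healthier
    by dietitians above the other. alpha balances the two terms (assumed)."""
    logits_a, score_a = model(img_a)
    logits_b, score_b = model(img_b)
    ce = nn.functional.cross_entropy(logits_a, label_a) + \
         nn.functional.cross_entropy(logits_b, label_b)
    # prefer_a is +1 where meal A was ranked healthier, -1 otherwise
    rank = nn.functional.margin_ranking_loss(score_a, score_b, prefer_a, margin=1.0)
    return ce + alpha * rank
```

At inference time, only the scalar score (or the classification head) is needed to compare two meals, so the pairwise setup applies only during training under these assumptions.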