DeepRecipes: Exploring Massive Online Recipes and Recovering Food Ingredient Amounts

Bibliographic Details
Main Authors: Kequan Li, Yan Chen, Hongsong Li, Xiangwei Mu, Xuhong Zhang, Xiaozhong Liu
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9423993/
Description
Summary: Ingredient amounts are crucial for food-oriented health systems, but this information is seldom used because it is difficult to extract from online recipes. This study proposes a predictive model named DeepRecipes to extract ingredient amounts from online textual recipes. The model predicts ingredient amounts from a given recipe's name and its listed ingredients. Because recipe names and ingredient lists can be extracted from almost all online recipes, the proposed model can potentially recover ingredient amounts for massive numbers of online recipes. We first trained the model on a small set of recipes containing all ingredients and their corresponding amounts. We then compared it against ten reference models, and DeepRecipes outperformed all of them. The model's mean absolute error (MAE) and mean absolute percentage error (MAPE) are $3.96\times 10^{-1}$ and 18.57%, respectively, and its absolute percentage errors (APEs) are below 50% in more than 95% of all predictions. This accuracy is sufficient for providing rough ingredient amount estimations for food-oriented health systems.
ISSN: 2169-3536
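
For readers unfamiliar with the metrics quoted in the summary, the sketch below shows how MAE, MAPE, and the share of APEs below 50% are typically computed. It is illustrative only: the function name, variable names, and sample amounts are assumptions, not the authors' evaluation code.

```python
import numpy as np

def evaluate_amount_predictions(y_true, y_pred):
    """Return MAE, MAPE (in %), and the fraction of predictions with APE < 50%."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)

    abs_err = np.abs(y_pred - y_true)          # absolute errors
    ape = abs_err / np.abs(y_true) * 100.0     # absolute percentage errors

    mae = abs_err.mean()
    mape = ape.mean()
    share_under_50 = (ape < 50.0).mean()
    return mae, mape, share_under_50

# Hypothetical ground-truth and predicted ingredient amounts (e.g., in grams)
true_amounts = [200.0, 50.0, 5.0, 120.0]
pred_amounts = [180.0, 55.0, 6.0, 100.0]

mae, mape, share = evaluate_amount_predictions(true_amounts, pred_amounts)
print(f"MAE = {mae:.3f}, MAPE = {mape:.2f}%, APE < 50% in {share:.0%} of predictions")
```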