KitchenScale: Learning to predict ingredient quantities from recipe contexts

Donghee Choi, Mogan Gim, Samy Badreddine, Hajung Kim, Donghyeon Park, Jaewoo Kang

Research output: Contribution to journal › Article › peer-review

Abstract

Determining proper ingredient quantities is an essential part of cooking practice, both for enriching taste and for promoting healthiness. We introduce KitchenScale, a fine-tuned Pre-trained Language Model (PLM) that predicts a target ingredient's quantity and measurement unit given its recipe context. To train KitchenScale effectively, we formulate ingredient quantity prediction as three sub-tasks: ingredient measurement type classification, unit classification, and quantity regression. Furthermore, we transfer cooking knowledge from recipe texts to PLMs, and we adopt the Discrete Latent Exponent (DExp) method to cope with the high variance of numerical scales in recipe corpora. Experiments on our newly constructed dataset, together with recommendation examples, demonstrate KitchenScale's understanding of various recipe contexts and its generalizability in predicting ingredient quantities. We also implemented a web application that demonstrates KitchenScale's functionality by recommending ingredient quantities expressed as numerals (e.g., 2) with units (e.g., ounce).
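
The abstract describes the model only at a high level. Below is a minimal sketch (Python/PyTorch, assuming a BERT-style encoder loaded via Hugging Face Transformers) of how the three sub-task heads and a DExp-style quantity head could be wired together. The class name, label counts, and the exponent-plus-mantissa formulation are illustrative assumptions made here for clarity, not the authors' released implementation.

    import torch
    import torch.nn as nn
    from transformers import AutoModel

    class KitchenScaleHeadSketch(nn.Module):
        """Hypothetical multi-task head: type/unit classification plus a
        DExp-style quantity prediction (discrete exponent + mantissa)."""

        def __init__(self, plm_name="bert-base-uncased",
                     n_types=3, n_units=20, n_exponents=8):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(plm_name)
            hidden = self.encoder.config.hidden_size
            self.type_head = nn.Linear(hidden, n_types)      # measurement type, e.g. weight / volume / count
            self.unit_head = nn.Linear(hidden, n_units)      # unit, e.g. gram, ounce, cup
            self.exp_head = nn.Linear(hidden, n_exponents)   # discrete exponent classes (DExp-style assumption)
            self.mantissa_head = nn.Linear(hidden, 1)        # mantissa regression

        def forward(self, input_ids, attention_mask):
            # Encode the recipe context containing the target ingredient; use the [CLS] vector.
            h = self.encoder(input_ids=input_ids,
                             attention_mask=attention_mask).last_hidden_state[:, 0]
            return {
                "type_logits": self.type_head(h),
                "unit_logits": self.unit_head(h),
                "exp_logits": self.exp_head(h),
                "mantissa": self.mantissa_head(h).squeeze(-1),
            }

Under this assumed formulation, training would combine cross-entropy losses on the three classification outputs with a regression loss on the mantissa, and a predicted quantity would be recovered as mantissa × 10^exponent before being paired with the predicted unit.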

Original language: English
Article number: 120041
Journal: Expert Systems With Applications
Volume: 224
DOIs
Publication status: Published - 15 Aug 2023

Bibliographical note

Publisher Copyright:
© 2023 Elsevier Ltd

Keywords

  • Cooking knowledge
  • Food computing
  • Food measurement
  • Ingredient quantity prediction
  • Pre-trained language models
  • Representation learning

ASJC Scopus subject areas

  • General Engineering
  • Computer Science Applications
  • Artificial Intelligence
