Abstract:
While the task of automatically detecting eating events has been examined in prior work using
various wearable devices, the use of smartphones as standalone devices to infer eating events remains an open
issue. This paper proposes a framework that infers eating vs. non-eating events from passive smartphone
sensing and evaluates it on a dataset of 58 college students. First, we show that time of day and features
from modalities such as screen usage, accelerometer, app usage, and location are indicative of eating and
non-eating events. Then, we show that eating events can be inferred with an AUROC (area under the receiver
operating characteristic curve) of 0.65 using subject-independent machine learning models, which can
be further improved to 0.81 for subject-dependent models and 0.81 for hybrid models using personalization
techniques. Moreover, we show that users have different behavioral and contextual routines around eating
episodes, requiring specific feature groups to train fully personalized models. These findings are of potential
value for future context-aware mobile food diary apps: enabling scalable sensing-based eating studies
using only smartphones; detecting under-reported eating events, thus increasing data quality in self-report-based
studies; providing functionality to track food consumption and generate reminders for timely
collection of food diaries; and supporting mobile interventions towards healthy eating practices.
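As an illustrative sketch only (not the authors' code), the subject-independent setting described above can be approximated with leave-one-subject-out cross-validation, scoring each held-out participant's eating vs. non-eating predictions with AUROC; the file name, feature names, and choice of a random-forest classifier below are assumptions for the example.

    # Sketch: subject-independent eating vs. non-eating classification evaluated
    # with leave-one-subject-out cross-validation and AUROC (assumed setup).
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import LeaveOneGroupOut

    # Hypothetical table: one row per labeled window, with passive sensing
    # features (time of day, screen, accelerometer, app usage, location).
    df = pd.read_csv("sensing_features.csv")                  # hypothetical file
    feature_cols = ["hour_of_day", "screen_on_ratio",         # hypothetical feature names
                    "accel_mean", "app_usage_count", "location_radius"]
    X = df[feature_cols].values
    y = df["is_eating"].values                                # 1 = eating, 0 = non-eating
    groups = df["subject_id"].values                          # one group per participant

    aurocs = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        scores = clf.predict_proba(X[test_idx])[:, 1]
        if len(np.unique(y[test_idx])) > 1:                   # AUROC needs both classes
            aurocs.append(roc_auc_score(y[test_idx], scores))

    print(f"Subject-independent AUROC: {np.mean(aurocs):.2f}")

Subject-dependent and hybrid (personalized) models would instead include some of the held-out participant's own data in training; the sketch above only covers the subject-independent baseline.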
Citation:
Bangamuarachchi, W., Chamantha, A., Meegahapola, L., Ruiz-Correa, S., Perera, I., & Gatica-Perez, D. (2022). Sensing eating events in context: A smartphone-only approach. IEEE Access, 10, 61249–61264. https://doi.org/10.1109/ACCESS.2022.3179702