Recognition of dietary activity events using on-body sensors

https://doi.org/10.1016/j.artmed.2007.11.007

Summary

Objective

An imbalanced diet elevates the health risks for many chronic diseases, including obesity. Dietary monitoring could contribute vital information to lifestyle coaching and diet management; however, current monitoring solutions are not feasible for long-term use. Towards automatic dietary monitoring, this work targets the continuous recognition of dietary activities using on-body sensors.

Methods

An on-body sensing approach was chosen, based on three core activities during intake: arm movements, chewing and swallowing. In three independent evaluation studies the continuous recognition of activity events was investigated and the precision-recall performance analysed. An event recognition procedure was deployed that addresses multiple challenges of continuous activity recognition, including dynamic adaptation to variable-length activities and flexible deployment supporting one to many independent classes. The approach uses a sensitive activity event search followed by a selective refinement of the detections using different information fusion schemes. The method is simple and modular in design and implementation.

Results

The recognition procedure was successfully adapted to the investigated dietary activities. Four intake gesture categories from arm movements and two food groups from chewing cycle sounds were detected and identified with a recall of 80–90% and a precision of 50–64%. The detection of individual swallows resulted in 68% recall and 20% precision. Sample-accurate recognition rates were 79% for movements, 86% for chewing and 70% for swallowing.
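These figures follow the standard precision-recall definitions for event spotting. As a minimal sketch (the event counts below are invented for illustration only; they are not taken from the study data, though they happen to reproduce the reported swallowing percentages):

```python
def precision_recall(tp, fp, fn):
    """Event-spotting metrics.

    precision = TP / (TP + FP): fraction of spotted events that are correct.
    recall    = TP / (TP + FN): fraction of true events that were spotted.
    """
    return tp / (tp + fp), tp / (tp + fn)

# Invented counts for illustration: 68 correctly spotted swallows (TP),
# 272 false detections (FP) and 32 missed swallows (FN) would yield
# 20% precision and 68% recall.
p, r = precision_recall(tp=68, fp=272, fn=32)
```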

Conclusions

Body movements and chewing sounds can be accurately identified using on-body sensors, demonstrating the feasibility of on-body dietary monitoring. Further investigations are needed to improve the swallowing spotting performance.

Introduction

Daily dieting behaviour strongly influences the risk of developing disease conditions. The disease most prevalently associated with an imbalanced diet is obesity. Current estimates account for over one billion overweight and 400 million obese people worldwide. This still-increasing trend has been attributed to the rapid changes in society and behavioural patterns over the last decades [1]. However, obesity is not the only diet-related disease that decreases healthy life-years in many populations; it also raises the risk for related diseases, including diabetes mellitus, different types of cancer and cardiovascular diseases. Often these diseases confound or overlay each other, preventing accurate accounting.

Several key risk factors controlled by dieting behaviour have been identified. These include the timing of food intake and its integration into the daily schedule. For example, intermediate snacking was found to contribute a major part of the daily energy intake [2]. Another critical aspect is food selection: high-energy foods can be replaced by foods with lower energy density, such as fruits and vegetables. This improves diet quality and lowers body weight [3].

Minimising individual risk factors is a preventive approach to systematically fighting the origin of diet-related diseases, and the most promising route to improving quality of life in the future. Since nutrition is an inherent part of daily activities, the adoption of a healthy diet requires individual lifestyle changes. These changes need to be implemented and maintained over periods of months and years. For this purpose, convenient long-term monitoring of dietary behaviour could become a vital tool to assess eating disorders and support diet modifications through feedback and coaching.

No single-sensor solution exists that could capture the whole process of food intake and is simple enough to implement for diet management. Currently, dietary activities are recorded manually by entering the information into food intake questionnaires. Mobile devices and Internet appliances are used to support the information entry, e.g. by taking pictures of the food [4] and estimating calories from entered data [5]. Further approaches to simplify data entry include scanning shopping receipts [6] or bar codes, as well as recording voice logs [7].

These manual acquisition methods require considerable effort from study participants, who must remember to enter the information into the questionnaire, and from study managers, who must verify and analyse the data. They are prone to errors such as imprecise timing due to back-filling, missing food item details, e.g. when using voice recordings [7], and low user compliance, especially for paper-based diaries [8].

Many dietary parameters, such as the rate of intake (in g/s) or the number of chews for a food piece, are rarely assessed because adequate sensing facilities are only available in laboratory settings. However, these parameters are related to palatability, satiety and speed of eating [9]. Behavioural investigations have utilised weighing tables in controlled settings to measure the amount and rate of food intake during the consumption of individual meals [10]. An oral implant sensor was developed to acquire information about these parameters [11]. However, such techniques certainly influence the user’s behaviour and are not feasible for long-term monitoring.

All noninvasive dietary monitoring techniques suffer from estimation errors regarding the exact amount and calorie content of every consumed food item. However, a rough estimate of relevant parameters, such as the ratio of fluid to solid foods, the food category and timing information (eating schedule and meal intake durations over the day), would provide a solid basis for behavioural coaching. We believe that much of this information can be extracted from on-body sensors.

In this work, we evaluate on-body sensing methods to automatically monitor dietary intake behaviour. In particular, three core aspects of dietary activity (sensing domains) were investigated by on-body sensors:

  • (1)

    Characteristic arm and trunk movements associated with the intake of foods, using inertial sensors.

  • (2)

    Chewing of foods, monitored by recording the food breakdown sound with an ear microphone.

  • (3)

    Swallowing activity, acquired by a sensor-collar containing surface electromyography (EMG) electrodes and a stethoscope microphone.

We derive pattern models for specific activity events from the sensor data of each domain and analyse the event recognition performance. For example, individual chews are considered events in the chewing domain. In particular, the paper makes the following contributions:

  • (1)

    We present a flexible event spotting method that can be applied either to an individual sensing modality or to a combination of several. The approach obtains its adaptivity from a variable-length feature pattern search. Its selective power originates from competitive and supportive fusion of event spottings with largely independent sources of errors. We summarise the domain-specific adaptations of the procedure. The pattern description is achieved using time- and frequency-domain features that model the temporal characteristics of an event. This approach avoids more complex algorithms such as hidden Markov models (HMMs).

  • (2)

    We analyse the recognition of individual arm movements as well as chewing and swallowing activities during the intake of different food items. For each domain, we describe the activity sensing approach, the domain-specific recognition constraints and the case studies conducted to obtain naturalistic evaluation data. Since our work targets a combined detection and classification of the activity events, we present quantitative results for both, indicating good performance and the feasibility of the sensing approaches for automatic dietary monitoring.

The evaluations are performed on data from three different studies. To analyse the recognition performance under realistic conditions, the data sets included other common activities, e.g. conversations and arbitrary movements.
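The two-stage spotting idea, a sensitive variable-length search followed by selective refinement through fusion, can be sketched as follows. The window lengths, the toy feature vector, the distance threshold and the overlap-based fusion rule are simplified placeholders for illustration, not the authors' actual implementation:

```python
import numpy as np

def features(x):
    """Toy feature vector for a segment: mean, standard deviation, peak-to-peak."""
    return np.array([x.mean(), x.std(), x.max() - x.min()])

def spot_events(signal, template, lengths, threshold):
    """Sensitive first stage: slide variable-length windows over the signal and
    report (start, end) segments whose features lie close to a trained template."""
    events = []
    for n in lengths:                       # variable-length pattern search
        for start in range(0, len(signal) - n + 1, max(1, n // 2)):
            segment = signal[start:start + n]
            if np.linalg.norm(features(segment) - template) < threshold:
                events.append((start, start + n))
    return events

def agreement_fusion(events_a, events_b, min_overlap=0.5):
    """Selective second stage: keep an event from modality A only if a
    sufficiently overlapping event was also spotted in modality B
    (supportive fusion of spottings with independent error sources)."""
    fused = []
    for a0, a1 in events_a:
        for b0, b1 in events_b:
            overlap = max(0, min(a1, b1) - max(a0, b0))
            if overlap >= min_overlap * (a1 - a0):
                fused.append((a0, a1))
                break
    return fused
```

The first stage is tuned for high recall and deliberately over-detects; precision is then recovered by combining spottings whose errors are largely independent, mirroring the sensitive search plus selective refinement described above.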

Section snippets

Dietary activity domains and related work

Activity monitoring and recognition has attracted researchers from many backgrounds, including machine vision and more recently pervasive and wearable computing. An exhaustive review of the literature is beyond the scope of this work. Instead, we focus on systems for behaviour and automatic dietary monitoring as well as research on the three sensing domains considered in this work.

Approaches towards automatic dietary monitoring typically build on intelligent infrastructures. Chang et al. [12]

Recognition and evaluation methods

The envisioned system is intended to be worn continuously during the daily routine. In all sensing domains, relevant activity events occur only sporadically, often embedded in a large set of other, non-relevant activities (NULL class). For example, stethoscope-like sound recordings intended to capture swallowing sounds at the throat inherently pick up speech or even environmental noise.

A method that targets the spotting of relevant activity events should be effective in retrieving correct events while

Study description

To evaluate our recognition approach for movements, a case series was recorded, utilising commercially available inertial sensors. Table 1 specifies the sensors used. The inertial sensors were attached onto a jacket at the lower and upper arm as well as the upper back. Fig. 3 illustrates the sensor positions.

The movements of the arms and upper body were recorded at a sampling rate of 100 Hz from four right-handed volunteers (1 female, 3 male, aged between 25 and 35 years). The participants were

Study description

For the evaluation of chewing sounds we used an ear microphone as indicated in Fig. 3. The miniature microphone was built into a standard ear pad and kept at the ear canal by an ear hook, as used for mobile phone headsets. In a single case study the chewing sounds from different foods were recorded at 16 bit, 44 kHz from a male individual with natural dentition (aged 29 years).

The participant was seated conveniently on a chair close to a table carrying the foods. He could still hear

Study description

Swallowing was analysed using surface EMG electrodes and a microphone sensor. The sensor positioning was identical for all participants. For some participants the sensors were embedded in a collar, which helped to quickly attach the sensors to the correct throat region. The location of the EMG electrodes was constantly verified; the collar supported stable positioning at the infra-hyoid position very well. The microphone was situated at the lower part of the throat, below the larynx. EMG was

Methodology

The continuous recognition of dietary activity events from sensor data patterns was evaluated in this work. Spotting activity events in continuous sensor data is a vital prerequisite for the deployment of activity detection in general. While the targeted activities can be described by a domain expert, the embedding data (NULL class) cannot be modelled due to the many degrees of freedom in human activities and the cost of large training data sets. Consequently, assumptions about the embedding

Conclusion

We presented novel approaches to monitor dietary activities using body-worn sensors. Three sensing domains were analysed that are directly linked to the sequence of dietary activities: intake movements, chewing and swallowing. We presented evaluation results from studies in each domain using an event recognition procedure that supports the detection and identification of specific activities in continuous sensor data.

The recognition of natural movements, such as for dietary intake, is a

Acknowledgements

The authors express their gratitude to all volunteers who participated in the studies related to this publication and to all reviewers for their very helpful comments. This work was supported by the Swiss State Secretariat for Education and Research (SER).

References (42)

  • MyFoodPhone, World’s first camera-phone & web-based-video nutrition service. Internet: accessed on August 2007 (Feb...
  • J. Beidler et al.

    The PNA project

    J Comput Sci Colleges

    (2001)
  • Mankoff J, Hsieh G, Hung HC, Lee S, Nitao E. Using low-cost sensing to support nutritional awareness. In: Goos G,...
  • Siek KA, Connelly KH, Rogers Y, Rohwer P, Lambert D, Welch JL. When do we eat? an evaluation of food items input into...
  • A.A. Stone et al.

    Patient non-compliance with paper diaries

    Brit Med J

    (2002)
  • H.R. Kissileff et al.

    Universal eating monitor for continuous recording of solid or liquid consumption in man

    Am J Physiol

    (1980)
  • Chang K-H, Liu S-Y, Chu H-H, Hsu JY, Chen C, Lin T-Y, et al. The diet-aware dining table: observing dietary behaviors...
  • Schmidt A, Strohbach M, van Laerhoven K, Friday A, Gellersen H-W, Context acquisition based on load sensing. In: Goos...
  • Patterson D, Fox D, Kautz H, Philipose M, Fine-grained activity recognition by aggregating abstract object usage. In:...
  • Chambers S, Venkatesh S, West G, Bui H. Hierarchical recognition of intentional human gestures for sports video...
  • Ogris G, Stiefmeier T, Junker H, Lukowicz P, Troster G. Using ultrasonic hand tracking to augment motion analysis based...