A new app aims to provide a handy way of recording the nutritional information of each meal. Keeping a traditional food diary, while effective, is time-consuming and prone to error when it comes to counting calories.
Researchers from Massachusetts Institute of Technology (MIT) recently presented a web-based prototype of a speech-controlled nutrition-logging system.
The system retrieves nutritional data from an online database maintained by the US Department of Agriculture (USDA) for the foods it recognises in the user's spoken input.
It pairs images with that data, allowing the user to further refine their description.
MIT and Tufts partnership
“If a user were to describe precise quantities of food, refinements can be made verbally,” said James Glass, a senior research scientist at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), who also leads the Spoken Language Systems Group.
“A user who begins by saying, ‘For breakfast, I had a bowl of oatmeal, bananas, and a glass of orange juice’ can then make the amendment, ‘I had half a banana,’ and the system will update the data it displays about bananas while leaving the rest unchanged.”
The system is the result of collaboration between MIT researchers and a team of nutritionists from Tufts University. The Tufts team had already been experimenting with mobile-phone apps for recording caloric intake before approaching CSAIL.
“The Tufts nutritionists believed that the apps that were out there to help people try to log meals tended to be a little tedious, and therefore people didn’t keep up with them,” said Glass.
“So they were looking for ways that were accurate and easy to input information.”
The researchers began by focusing on two specific problems. The first required the system to recognise that if the user said “bowl of oatmeal”, nutritional information on oatmeal was relevant; if, however, the user said “oatmeal cookie”, it was not.
The team turned to the Amazon Mechanical Turk crowdsourcing platform, where they asked volunteers to describe what they’d eaten at recent meals. Relevant words were then labelled in the description as names of foods, quantities, brand names, or modifiers of the food names.
Therefore, in the phrase “bowl of oatmeal”, “bowl” is a quantity and “oatmeal” is a food; but in “oatmeal cookie”, “oatmeal” is a modifier.
Once the team had labelled about 10,000 meal descriptions, machine-learning algorithms were brought into play to identify patterns in the relationships between words and so determine their functional roles.
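The role-labelling idea can be illustrated with a toy sketch. This is not the researchers' model, which was a statistical system trained on the ~10,000 labelled descriptions; the word lists and rules below are invented for illustration. The point it demonstrates is that a word's role depends on context: “oatmeal” is a food on its own, but a modifier when it precedes another food word.

```python
# Toy role tagger (illustrative only; the real system learned these
# patterns from thousands of crowd-labelled meal descriptions).

FOODS = {"oatmeal", "cookie", "banana", "bananas", "juice"}
QUANTITIES = {"bowl", "glass", "half", "cup"}

def tag_tokens(tokens):
    """Assign a role (QUANTITY, FOOD, MODIFIER, OTHER) to each token."""
    tags = []
    for i, word in enumerate(tokens):
        w = word.lower()
        if w in QUANTITIES:
            tags.append((word, "QUANTITY"))
        elif w in FOODS:
            # A food word directly followed by another food word acts as
            # a modifier, e.g. "oatmeal" in "oatmeal cookie".
            nxt = tokens[i + 1].lower() if i + 1 < len(tokens) else None
            role = "MODIFIER" if nxt in FOODS else "FOOD"
            tags.append((word, role))
        else:
            tags.append((word, "OTHER"))
    return tags

print(tag_tokens("bowl of oatmeal".split()))
print(tag_tokens("oatmeal cookie".split()))
```

Here the same word, “oatmeal”, is tagged FOOD in the first phrase and MODIFIER in the second, which is exactly the distinction the system had to learn.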
The second challenge involved matching the user’s phrasing with the entries in the USDA database. One example of this was attempting to match the word “oatmeal” with the USDA’s entry, which was recorded under the heading “oats”.
Here, the researchers used Freebase, an open-source database that had entries on more than 8,000 common food items, including synonyms. Where synonyms were absent, they asked Mechanical Turk workers to supply them.
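The matching step reduces to resolving a spoken word to the heading the database actually uses. A minimal sketch, with an invented synonym table standing in for the entries the researchers drew from Freebase and Mechanical Turk:

```python
# Hypothetical synonym table; the real mapping covered 8,000+ food items.
SYNONYMS = {
    "oatmeal": "oats",
}

def database_heading(food_word, known_headings):
    """Resolve a spoken food word to a database heading, via synonyms."""
    word = food_word.lower()
    if word in known_headings:
        return word               # direct hit in the database
    return SYNONYMS.get(word)     # synonym lookup, or None if unknown

headings = {"oats", "banana", "orange juice"}
print(database_heading("oatmeal", headings))
```

Returning `None` for an unrecognised word marks the gap the researchers filled by asking Mechanical Turk workers for further synonyms.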
While this version of the system reports calorie counts, it cannot automatically total them. Tufts researchers are currently working on this and plan a future study to determine whether it makes nutrition logging easier.
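Computationally, the totalling step is a simple aggregation once per-item figures exist; the hard part, as the article makes clear, is recognising the items and quantities correctly in the first place. A sketch, with illustrative calorie figures rather than real USDA data:

```python
def total_calories(meal):
    """Sum calories over (item, servings, calories_per_serving) entries."""
    return sum(servings * per_serving for _, servings, per_serving in meal)

# Figures below are made up for illustration, not taken from the USDA database.
breakfast = [
    ("oatmeal, 1 bowl", 1.0, 150),
    ("banana", 0.5, 105),          # amended to half a banana
    ("orange juice, 1 glass", 1.0, 110),
]
print(total_calories(breakfast))  # 312.5
```

Note how the verbal amendment in Glass's example (“I had half a banana”) maps onto simply changing one servings value before totalling.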
“I think logging is enormously helpful for many people,” said Susan Roberts, director of the Energy Metabolism Laboratory at Tufts’ Jean Mayer USDA Human Nutrition Research Center on Aging.
“It makes people more self-aware about the junk they are eating and how little they actually enjoy it, and the shock of huge portions, et cetera. But currently, it is really tedious to log your food.”
“A spoken-language system that you can use with your phone would allow people to log food wherever they are eating it, with less work,” she added. “As I see it, we need to come up with something that really isn’t much work, so it isn’t an extra burden in life.”