For people struggling with obesity, logging calorie counts and other nutritional information at every meal is a proven way to lose weight. The technique does require consistency and accuracy, however, and when it fails, it's usually because people don't have the time to find and record all the information they need.

A few years ago, a team of nutritionists from Tufts University who had been experimenting with mobile-phone apps for recording caloric intake approached members of the Spoken Language Systems Group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) with the idea of a spoken-language application that would make meal logging even easier.
This week, at the International Conference on Acoustics, Speech, and Signal Processing in Shanghai, the MIT researchers are presenting a Web-based prototype of their speech-controlled nutrition-logging system.
With it, the user verbally describes the contents of a meal, and the system parses the description and automatically retrieves the pertinent nutritional data from an online database maintained by the U.S. Department of Agriculture (USDA).
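To give a rough sense of the kind of lookup involved (this is an illustration, not the researchers' code), the sketch below queries the public USDA FoodData Central search API, the modern successor to the nutrient database the article describes, and prints the calorie figure for the best match. The API key shown is a placeholder, and the response fields assumed here reflect the FoodData Central search format.

```python
# Illustrative sketch only: look up a food in the USDA FoodData Central
# search endpoint and report its energy value. Not the MIT system's pipeline.
import requests

API_KEY = "DEMO_KEY"  # placeholder; a real key is issued via api.data.gov


def lookup_calories(food: str) -> None:
    resp = requests.get(
        "https://api.nal.usda.gov/fdc/v1/foods/search",
        params={"api_key": API_KEY, "query": food, "pageSize": 1},
        timeout=10,
    )
    resp.raise_for_status()
    foods = resp.json().get("foods", [])
    if not foods:
        print(f"No match for {food!r}")
        return
    top = foods[0]
    # Each nutrient entry in the search result carries a name, value, and unit.
    for nutrient in top.get("foodNutrients", []):
        if nutrient.get("nutrientName") == "Energy":
            print(f"{top['description']}: {nutrient['value']} {nutrient['unitName']}")
            return


lookup_calories("banana")
```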
The data is displayed together with images of the corresponding foods and pull-down menus that allow the user to refine their descriptions -- selecting, for instance, precise quantities of food. But those refinements can also be made verbally. A user who begins by saying, "For breakfast, I had a bowl of oatmeal, bananas, and a glass of orange juice" can then make the amendment, "I had half a banana," and the system will update the data it displays about bananas while leaving the rest unchanged.
"What [the Tufts nutritionists] have experienced is that the apps that were out there to help people try to log meals tended to be a little tedious, and therefore people didn't keep up with them," says James Glass, a senior research scientist at CSAIL, who leads the Spoken Language Systems Group. "So they were looking for ways that were accurate and easy to
input12 information."