Case study
Lose It! uses ML Kit to extract data from nutrition labels and improve user experience
Since 2008, Lose It! has helped more than 30 million people lose over 50 million pounds. The app helps users manage their diets by making food logging as easy as possible, and its product team is always looking for new ways to make things even simpler.
The team behind Lose It! first introduced Snap It, an object-recognition algorithm designed to help users log their favorite foods just by taking a photo. But the high computational cost of the algorithm necessitated the use of a GPU server, which meant they weren’t able to make the experience available in real time.
Around the same time, the team also wanted to add a nutrition label-scanning feature, but server-side analysis was only marginally faster than having the user enter the information themselves. Plus, because variable cellular network speeds and signal strength could make or break the user experience, they couldn’t guarantee consistent performance.
The Lose It! team knew their users would love both features, but only if they could make them fast enough to offer the experience in real time. So what could they do to speed things up?
What they did
The team turned to ML Kit to solve their speed issues. “ML Kit has proven extremely useful for deploying our food-recognition feature, Snap It, using compressed, quantized TF Lite models,” said Edward W. Lowe, Jr. Ph.D., Director of Data Science and AI at Lose It! Previously, the Snap It algorithm was deployed on a server, necessitating the transfer of a food image to the server for inference. But “a custom model hosted by ML Kit allowed us to seamlessly implement a quantized Snap It model on device, which enables our users to utilize this feature in real time and without a data connection,” Lowe said. “And by leveraging the on-device text recognition API, we were able to significantly reduce the image analysis time for nutrition label reading.”
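The nutrition-label flow Lowe describes maps onto ML Kit’s on-device text recognition API. The sketch below is written against the current standalone ML Kit SDK for Android (the case study predates the split from the Firebase-hosted ML Kit, so Lose It!’s actual code likely differs), and the calorie-parsing regex is purely illustrative:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// On-device recognizer; inference needs no network round trip.
private val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

// Hypothetical pattern for illustration only; a production parser would
// handle localized labels, units, and serving sizes.
private val caloriesPattern = Regex("""Calories\s+(\d+)""", RegexOption.IGNORE_CASE)

fun readCalories(labelBitmap: Bitmap, onResult: (Int?) -> Unit) {
    val image = InputImage.fromBitmap(labelBitmap, /* rotationDegrees = */ 0)
    recognizer.process(image)
        .addOnSuccessListener { visionText ->
            // visionText.text is the full recognized text of the label.
            val calories = caloriesPattern.find(visionText.text)
                ?.groupValues?.get(1)?.toIntOrNull()
            onResult(calories)
        }
        .addOnFailureListener { onResult(null) }
}
```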
“Importantly, ML Kit allows us to host models in Firebase,” Lowe added. “This enables us to seamlessly update models on device without updating the app, reduces the app size, and allows us to A/B test model versions. Disconnecting model deployment from app release allows us to respond quickly to changing user behavior and to better cope with drift.”
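Hosting a model in Firebase and updating it without an app release corresponds to ML Kit’s custom model support. The sketch below shows one way to wire that up with today’s SDK (it requires the `com.google.mlkit:linkfirebase` dependency); the model name and download conditions are assumptions, not details from the case study:

```kotlin
import com.google.mlkit.common.model.CustomRemoteModel
import com.google.mlkit.common.model.DownloadConditions
import com.google.mlkit.common.model.RemoteModelManager
import com.google.mlkit.linkfirebase.FirebaseModelSource

// "snap_it" is a placeholder for whatever name the model is published
// under in the Firebase console; publishing a new version there updates
// clients without shipping a new app build.
val remoteModel = CustomRemoteModel.Builder(
    FirebaseModelSource.Builder("snap_it").build()
).build()

val conditions = DownloadConditions.Builder()
    .requireWifi() // assumed policy: only fetch new versions on Wi-Fi
    .build()

RemoteModelManager.getInstance()
    .download(remoteModel, conditions)
    .addOnSuccessListener {
        // The quantized TF Lite model is now cached on device and can be
        // passed to an ML Kit custom classifier or a TF Lite interpreter.
    }
```

Because downloads are keyed by the published model name, A/B testing model versions typically amounts to pointing different user cohorts at different published models, for example via Firebase Remote Config and A/B Testing.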
Results
Thanks to ML Kit, Lose It! was able to launch a widely available, high-performance nutrition label reader. Users can now simply scan a nutrition label to instantly fill in the nutrition information for any new food. In most cases the information is recognized in less than one second, and the user doesn’t even need to take a picture: the app pulls the information right from the camera view in real time.
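For the real-time behavior described above, the usual pattern is to feed camera frames straight into the recognizer rather than waiting for a photo. Here is a minimal sketch using CameraX’s ImageAnalysis; the case study does not say which camera API Lose It! uses, so the plumbing here is an assumption:

```kotlin
import androidx.camera.core.ExperimentalGetImage
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Hypothetical analyzer: runs on-device text recognition on each camera
// frame and hands the recognized text to a callback for parsing.
@ExperimentalGetImage
class NutritionLabelAnalyzer(
    private val onLabelText: (String) -> Unit
) : ImageAnalysis.Analyzer {

    private val recognizer =
        TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

    override fun analyze(frame: ImageProxy) {
        val mediaImage = frame.image ?: run { frame.close(); return }
        val input = InputImage.fromMediaImage(mediaImage, frame.imageInfo.rotationDegrees)
        recognizer.process(input)
            .addOnSuccessListener { onLabelText(it.text) }
            // Closing the frame releases it so the next camera frame is delivered.
            .addOnCompleteListener { frame.close() }
    }
}
```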