Google on Tuesday launched five new Lens filters for subjects ranging from food to foreign languages.
The new features will provide better and faster overlays of information on real-world objects, the company said. Google gave CNET's Scott Stein an early look at what Lens filters can do just ahead of its I/O developer conference earlier this month.
The Dining filter will automatically highlight popular dishes on a menu, tapping into Google Maps to surface photos and reviews of specific dishes.
“And when you’re done with your meal, just point the camera at your receipt, and Lens can help calculate the tip and split the bill,” Google said.
With the Translate filter, Google Lens will detect a language and overlay a translation on top of the words. It works across over 100 languages, according to Google.
The Text filter lets users copy and paste text from real-world objects, such as Wi-Fi passwords, gift card codes and recipes, onto their phones. The Shopping filter surfaces similar items when the camera is pointed at clothing, furniture or home decor, and can also scan barcodes.
The Auto filter provides search results based on whatever object a user points the camera at.
“We’re taking Google Lens and taking it from, ‘oh, it’s an identification tool, what’s this, show me things like this,’ to an AR browser, meaning you can actually superimpose information right on the camera,” Aparna Chennapragada, vice president and general manager for camera and AR products at Google, told CNET earlier in May.
“One of the questions we had was, if we can teach the camera to read, can we use the camera to help people read?” she added regarding the Lens Translate feature. “This is obviously useful in cases where you’re in a foreign city and you can’t speak the language, but in many parts of the world, people can’t speak or read their own language.”