Google has released new Google Lens features that show users extra information about a restaurant’s menu. Lens has also gained improved live translation through the mobile camera.
“By pointing the camera at a restaurant’s menu, Google Lens will show users the most popular dishes, information about the food, and more.”
The functions, available on iOS and on Android devices compatible with ARCore, were announced at the beginning of this month at Google I/O, Google’s developer conference, and are now beginning to reach users.
Users can access Lens through the Google Assistant, Google Photos, or the company’s search engine. Pixel users, meanwhile, have Google Lens built directly into their camera app.
With the new restaurant function, users can point the camera at a restaurant’s menu and Google Lens will show which dishes are most popular, some information about the food, and even photos of the restaurant from its Google Maps profile.
In addition, Lens now incorporates a lighter version of Google Translate to translate signs and other images live.
Using any of the aforementioned entry points, Lens now lets users translate all kinds of written text directly through the mobile camera.
Bringing both functions into the Lens interface makes things easier for users, who no longer have to switch between several Google apps to perform these actions.
The translation function, combined with the restaurant function, also makes Lens an ideal tool for travelers.
The launch of these functions, just weeks after the latest I/O, is surprising, given that Google usually takes longer to update Lens after announcing changes at its conferences.