What’s new in Google Lens

July 14, 2021

Everyone knows that Google is great at searching the web. Until now that has meant text queries, but at the Google I/O developer conference the company introduced the ability to search not just text but real-world objects, with the camera's help. Let's look at exactly what new capabilities have appeared.

Integration of Augmented Reality with Google Maps

Sometimes, especially in an unfamiliar city, it can be hard to tell which way you are facing, even if Google Maps can pinpoint your location. Soon it will be possible to activate Google Maps inside the camera app and see useful contextual information.

It will show you where you are in the real world. Point the camera lens in the direction you are facing and you will see information about the businesses around you, street names, and more overlaid on the view. This makes it easier to get your bearings and to discover restaurants and other places you might otherwise have walked past.

Smart text selection

If you are a school or university student, you have probably wished you could copy text from books and paste it into your homework as quotes. Now learners can do exactly that. One of the best new Google Lens features lets you select and manipulate printed text simply by pointing the camera at it, much like selecting text with a mouse on a monitor. Provided hyphenation and indentation don't get in the way, the text can be pasted straight into a message or document.

If it all works as demonstrated, it will save a lot of time. For example, you could copy a recipe from a cookbook and text it to a friend or relative who has gone to the store. Or you could be reading a menu in a French restaurant, point your smartphone at the name of a dish, and get a description and a list of ingredients.
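Under the hood, this kind of camera-based text capture comes down to on-device optical character recognition. Google hasn't published how Lens itself does it, but a minimal sketch of the same idea is possible with Google's publicly available ML Kit text-recognition library (the `com.google.mlkit:text-recognition` dependency); the `copyTextFromPhoto` helper name and the use of a `Bitmap` input are my own choices for illustration:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Hypothetical helper: recognize printed text in a camera frame so it can
// be copied into a message or document, Lens-style. This uses ML Kit's
// public text-recognition API, not Lens's actual (unpublished) pipeline.
fun copyTextFromPhoto(bitmap: Bitmap, onText: (String) -> Unit) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(image)
        .addOnSuccessListener { result ->
            // result.textBlocks preserves paragraph structure;
            // result.text is the flattened string, ready to paste.
            onText(result.text)
        }
        .addOnFailureListener { e ->
            // Recognition can fail on blurry frames or unusual fonts.
            e.printStackTrace()
        }
}
```

The hyphenation caveat above is real: the recognizer returns lines as printed, so a word split across lines in the book comes back split, and an app would need its own cleanup step before pasting.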

Matching styles

If you follow fashion or interior design, you know how hard it is to find matching accessories. The Style Match feature is meant to solve that problem.

You point the camera at an object, and suggestions for similar items appear. In one of the on-stage demos, a search for a lamp brought up Google image results with similar bases, along with their prices. The matches looked quite close.

Last year, Google added a similar feature to its regular image search, but showing the results in real time through the camera is far more useful.
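Lens's actual similarity search is proprietary, but the first step in any such pipeline is recognizing what the camera is looking at. As a rough sketch of that building block, Google's public ML Kit image-labeling API (`com.google.mlkit:image-labeling`) can classify a frame ("Lamp", "Furniture", ...), and those labels could seed a product query; the `labelFrame` helper and the confidence cutoff are my own illustration, not how Style Match works:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Hypothetical helper: identify the subject of a camera frame with
// ML Kit's on-device image labeler. A real visual search like Style
// Match would compare image embeddings rather than coarse labels.
fun labelFrame(bitmap: Bitmap, onLabels: (List<String>) -> Unit) {
    val image = InputImage.fromBitmap(bitmap, 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Keep only confident guesses, e.g. "Lamp" at 0.92.
            onLabels(labels.filter { it.confidence > 0.7f }.map { it.text })
        }
        .addOnFailureListener { it.printStackTrace() }
}
```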

Live Search Results

Today, Google Lens requires you to deliberately select an item on the screen and wait for the search results to appear. The updated Lens will surface relevant results in real time. You can stand next to a building and learn its history or height, then pan the camera to another building and get information about that one. This should make travel more informative.

Thanks to improvements in machine learning, you can point the camera at a concert poster and Google will start playing a music video by the artist featured on it.

Support for third-party camera applications

At first, Google Lens worked only on Pixel smartphones, but this year support was extended to the Google Photos app and Google Assistant. Soon, owners of other Android smartphones will be able to use Lens right inside their camera app.

These will include cameras from Sony, Nokia, LG, Xiaomi, OnePlus, BQ and Asus. Samsung and Huawei rely on their own artificial intelligence in their cameras. Hopefully the quality on all of these devices will be no worse than on Pixel smartphones. Google Lens is an impressive service that will improve your mobile experience once you get to try it.