Google introduces a new way to search that combines images and text into one query
11 Apr, 2022 / 02:29 AM / OMNES Media LLC

Source: https://techcrunch.com/

TechCrunch: Earlier this year, at its annual I/O developer conference, Google introduced a new AI milestone called the Multitask Unified Model, or MUM. This technology can simultaneously understand information across a wide range of formats, including text, images and videos, and draw insights and connections between topics, concepts and ideas. Today, Google announced one of the ways it plans to put MUM to work in its own products with an update to its Google Lens visual search.

Google Lens is the company’s image recognition technology that lets you use your phone’s camera to perform a variety of tasks, like real-time translation, identifying plants and animals, copying and pasting text from photos, finding items similar to what’s in the camera’s viewfinder, getting help with math problems and much more.

Soon, Google says, it will leverage MUM’s capabilities to upgrade Google Lens with the ability to add text to visual searches, allowing users to ask questions about what they see.

In practice, here’s how such a feature could work. You could pull up a photo of a shirt you like in Google Search, then tap the Lens icon and ask Google to find you the same pattern, but on a pair of socks. By typing something like “socks with this pattern,” you could direct Google to return relevant results in a way that would be difficult with text input alone.
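
Google hasn’t published an API for this Lens feature, but the underlying idea of a joint image-and-text query can be sketched with the open-source CLIP model, which, like MUM, maps images and text into a shared embedding space. The file name, checkpoint choice and the simple averaging step below are illustrative assumptions, not Google’s method:

```python
# Illustrative sketch only: Google has not published an API for this feature.
# CLIP embeds images and text in one vector space, so a photo of a shirt and
# the refinement "socks with this pattern" can be fused into a single query.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

query_image = Image.open("shirt_photo.jpg")   # hypothetical input photo
query_text = "socks with this pattern"        # the textual refinement

inputs = processor(text=[query_text], images=query_image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Average the two embeddings into one query vector; a real system would use
# this vector for a nearest-neighbour lookup against a product index.
combined_query = (outputs.image_embeds + outputs.text_embeds) / 2
```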

This could be particularly useful for the types of queries Google struggles with today, where there’s a visual component to what you’re looking for that is either hard to describe in words alone or could be described in different ways. By combining the image and the words into a single query, Google may have a better shot at delivering relevant search results.

In another example, say a part of your bike has broken and you need to search Google for repair tips. However, you don’t know what the piece is called. Instead of delving into repair manuals, you could point Google Lens at the broken part of your bike, then type “how to fix.” This could connect you directly with the exact moment in a video that could help.
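
Again, purely as an illustration and not Google’s actual pipeline: given a combined query vector like the one sketched above, a system could rank timestamped video segments by similarity and jump straight to the best-matching moment. The helper below is hypothetical, using plain cosine similarity:

```python
# Hypothetical sketch: rank precomputed video-segment embeddings against the
# combined image+text query and return the timestamp of the closest match.
import numpy as np

def best_moment(query_vec, segment_vecs, timestamps):
    q = query_vec / np.linalg.norm(query_vec)
    s = segment_vecs / np.linalg.norm(segment_vecs, axis=1, keepdims=True)
    scores = s @ q                       # cosine similarity per segment
    return timestamps[int(np.argmax(scores))]

# Example: three segments of a repair video, each with a start time in seconds.
rng = np.random.default_rng(0)
segments = rng.normal(size=(3, 512))     # stand-in segment embeddings
query = rng.normal(size=512)             # stand-in combined query vector
print(best_moment(query, segments, [0, 42, 95]))
```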

The company sees these AI-driven initiatives as ways to make its products “more helpful” to end users by enabling new ways to search. By making the phone’s camera part of Search, Google is aiming to stay relevant in a market where many of its core use cases are starting to shift to other properties. Many shopping searches today, for instance, now start directly on Amazon. When iPhone users need to do something specific on their phones, they often turn to Siri, Spotlight, the App Store or a native app for help. And Apple is developing its own alternative to Google Search as well: you can see the beginnings of this work in the iOS 15 update to Spotlight search, which now connects users directly to the information they need without a Google query.

Google is also putting MUM to work in other ways across Google Search and video search, the company announced at its Search On live event today.

Google says the Lens update will roll out in the months ahead, noting that the feature still needs to go through “rigorous testing and evaluation,” part of the process for every new AI model it deploys.