Earlier this month, Google hosted its annual Google I/O event. The search engine giant revealed a bunch of new features and improvements, including a new feature named “Multisearch Near Me” that lets users search for anything using text and images simultaneously via Google Lens.
What is the Google Multisearch feature?
Google Lens is the company’s image recognition technology. Google first teased the multisearch functionality in September last year. Multisearch is currently in beta and is available in English in the United States.
Google has long been the go-to search engine for billions of users across the globe, and a growing number of queries now go through Google Lens. Let’s check out how this works.
How to search Google with an image using Google Lens?
We can access the Google multisearch functionality while searching for a product using Google Lens. While doing so, we can add “near me” to the search, which narrows the results to places nearby. The Multisearch “Near Me” functionality works on items like food, apparel, and more.
Google notes that users need the latest version of the app to access the new functionality. With the new multisearch feature, we can also ask questions about the object in front of us. Furthermore, we can refine the search results by visual attributes such as color or brand.
Multisearch feature on Google Lens
To get started,
- Open the Google app on Android or iOS
- Tap on the camera (Lens) icon
- Search one of your screenshots, or take a photo of what is in front of you
- Swipe up and tap the “+ Add to your search” button to add text to your image query
How does Google Multisearch work?
During the Google I/O 2022 event, the search engine giant said the new feature currently works best for shopping searches, and it expects the use cases to grow over time. Even in the initial beta launch, we can do things beyond shopping, though it won’t be perfect for every search.
The new functionality could be useful for queries Google currently struggles with, particularly those where the visual component of an object is hard to describe in words. By combining an image and words into a single query, Google aims to deliver a more relevant search experience.
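To make the idea of a single image-plus-text query concrete, here is a minimal sketch using the open-source CLIP model via Hugging Face Transformers. This is purely illustrative and not Google’s actual system; the file name, the refinement text, and the simple vector averaging are all assumptions made for the example.

```python
# Conceptual sketch: fuse an image and a text refinement into one query vector.
# This is NOT Google's implementation; it uses the open-source CLIP model
# purely to illustrate how image and text can drive a single search.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("yellow_dress.jpg")          # hypothetical photo from the camera
refinement = "the same dress but in green"      # hypothetical "+ Add to your search" text

inputs = processor(text=[refinement], images=image,
                   return_tensors="pt", padding=True)
with torch.no_grad():
    image_vec = model.get_image_features(pixel_values=inputs["pixel_values"])
    text_vec = model.get_text_features(input_ids=inputs["input_ids"],
                                       attention_mask=inputs["attention_mask"])

# Fuse the two modalities into one query vector (a naive average here;
# a production system would use a learned fusion instead).
query = torch.nn.functional.normalize(image_vec + text_vec, dim=-1)
# `query` could now be matched against an index of product embeddings.
```

In practice, a search backend would compare such a fused query vector against a pre-built index of image or product embeddings, rather than averaging embeddings as crudely as this sketch does.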
In a recent blog post, the search engine giant announced, “At Google, we’re always dreaming up new ways to help you uncover the information you’re looking for — no matter how tricky it might be to express what you need.” The post adds, “That’s why we’re introducing an entirely new way to search: using text and images simultaneously. With multisearch on Lens, you can go beyond the search box and ask questions about what you see.”
The idea behind Google Multisearch
The search engine giant says the new functionality is made possible by its latest advancements in artificial intelligence, and it is exploring more ways its MUM model could enhance multisearch.
What is MUM?
MUM, short for Multitask Unified Model, is Google’s latest AI model for Search. It can simultaneously understand information across various formats, including text, images, videos, and drawings. Additionally, it can create connections between topics, concepts, and ideas.
Scene Exploration
Along with Multisearch Near Me, the tech giant also introduced Scene Exploration, which offers a more open-ended search experience. With Scene Exploration on Google Lens, we can search multiple items at once and get details about each of them simply by panning the camera over them.
According to Google, as we pan over the items in front of us, Lens captures multiple frames and stitches them into a single scene. The Multisearch Near Me feature will roll out globally later this year; for now, it is available in English in the United States.
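As a rough illustration of this frame-by-frame idea, the sketch below runs an off-the-shelf object detector over a few panned camera frames and aggregates everything it sees. This is not Google’s pipeline; the torchvision detector, the confidence threshold, and the frame file names are all assumptions made for the example.

```python
# Rough sketch of the Scene Exploration idea: detect objects in several
# camera frames from a pan and aggregate what was seen across the scene.
import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# Hypothetical frames captured while panning across a shelf.
frames = ["pan_frame_01.jpg", "pan_frame_02.jpg", "pan_frame_03.jpg"]
seen_labels = set()

with torch.no_grad():
    for path in frames:
        image = to_tensor(Image.open(path))
        (detections,) = model([image])          # one result dict per input image
        # Keep only confident detections and remember their COCO class ids.
        keep = detections["scores"] > 0.7
        seen_labels.update(detections["labels"][keep].tolist())

print(f"Distinct object classes seen across the panned scene: {seen_labels}")
```

A real system would also track objects across overlapping frames to avoid double counting and would attach richer metadata (ratings, prices, and so on) to each detected item, but the aggregation step above captures the basic idea.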
Diversified Search Filter
Along with the Multisearch Near Me feature, the tech giant also updated its search filters for images. For certain queries, Google Images now offers a filter for choosing skin tones. This filter is based on Dr. Monk’s skin tone scale, which is said to be more accurate and inclusive.
With this skin tone filter, users can get images that are relevant to their own skin tones. Additionally, Google will soon surface images with a schema that lets creators, brands, and publishers label attributes like skin color, texture, and other details shown in the images.
Quick Phrase Search on Google Assistant
When Google Assistant first rolled out, saying “Ok Google” felt practical, but by now many users are tired of repeating it. With the launch of the quick phrase feature, Google Assistant responds more quickly and lets us search the web in a more natural, human way.
Instead of saying phrases like “Ok Google” or “Hey Google”, we can instruct Google Assistant directly. For instance, we can simply ask, “What’s the weather outside?” instead of “Hey Google, what’s the weather outside?”. We also don’t need to phrase queries rigidly: even if we stumble while speaking, the Assistant will pick up our words and show the best possible results.
Personalized Ads in Search Results
After the Multisearch Near Me feature, personalized ads are one of the most interesting additions coming to Google Search. Instead of seeing random, unfiltered advertisements, we can choose which ads we want to see by picking categories of our choice.
Furthermore, we can also choose to see fewer ads while searching and browsing online. Google will also let us take more control of our data on the web: for instance, if our email address or phone number appears in search results, the new update lets us take it down with a removal request. It is worth noting that the information will only disappear from search results; it will not be removed from the websites that host it. The feature is currently in beta and will be officially available globally later this year.