A new Google feature that lets users search with images and text combined to find local retailers offering the apparel, home goods or food they’re looking for will soon roll out to users in the U.S., Google announced today at its “Search On” event. The company first previewed the feature at its Google I/O developer conference this May, signaling a development that seemed built for a future where AR glasses could be used to kick off searches.
The capability builds on Google’s AI-powered “multisearch” feature introduced in April, which let users combine a photo and text to craft custom searches, initially around shopping for apparel. For instance, you could search Google using a photo of a dress, then type the word “green” to limit results to just those where the dress was available in that specific color.
Multisearch Near Me, meanwhile, expands this functionality even further by pointing the user to a local retailer that has the green dress in stock. It can also be used to locate other types of items, like home goods, hardware, shoes, or even a favorite dish at a local restaurant.
“This new way of searching is really about helping you connect with local businesses, whether you’re looking to support your local neighborhood shop or you just need something right away and can’t wait for the shipping,” said Cathy Edwards, VP and GM of Search at Google.
At its developer conference, Google previewed how the feature would work: users could leverage their phone’s camera or upload an image to begin this new type of search query. The company also demonstrated how a user could one day pan their camera around the scene in front of them to learn more about the objects in it, a capability some speculated would make a compelling addition to AR glasses.
However, the feature itself was not yet available to users at the time; it was just a preview.
Today, Google said Multisearch Near Me will roll out to U.S. users in English “this fall.” It didn’t give an exact launch date.
Plus, the multisearch feature itself (without the local component) will expand to support more than 70 languages in the next few months.
Google to launch its image and text-based ‘Multisearch Near Me’ local search feature in the U.S. by Sarah Perez originally published on TechCrunch