Google has held its annual Search On event, with the 2022 showcase bringing a few goodies that build on what was announced last year.
If you want to watch Search On 22, you can do so right here. But here’s the TL;DR.
What Google announced at Search On 22
“At Search On today, we showed how advancements in artificial intelligence are enabling us to transform our information products yet again,” Google said. “We’re going far beyond the search box to create search experiences that work more like our minds, and that are as multidimensional as we are as people.”
Google has a vision for a world in which you’ll be able to find exactly what you’re looking for by combining images, sounds, text and speech. The company expects you’ll be able to ask questions, with fewer words — or even none at all — and Google will still understand exactly what you mean.
“We call this making search more natural and intuitive, and we’re on a long-term path to bring this vision to life for people everywhere,” Google added in a Search On 22 blog post.
Updates to Maps
Neighbourhood vibe will let you select a neighbourhood and see its most popular spots “come to life”. The feature relies heavily on Google users uploading photos, videos and reviews – an easy ask if Google is right in claiming there are more than 20 million contributions to the map each day. Google then layers its AI smarts over the top to give you a more “visual-first” Maps experience.
“Thanks to advancements in computer vision and predictive models, we’re completely reimagining what a map can be. This means you’ll see our 2D map evolve into a multi-dimensional view of the real world, one that allows you to experience a place as if you are there,” Google said.
Neighbourhood vibe will start rolling out globally on Android and iOS in the coming months.
Immersive view, meanwhile, combines aerial imagery with information about what the weather, traffic and crowds will be like on a given day and at a given time. Unfortunately, this feature is only launching in Los Angeles, London, New York, San Francisco and Tokyo for now.
Earlier this week, Google also updated Maps in Australia to display air quality information, with data provided by state and territory governments around the country.
Changes to Search
Key to this batch of announcements is Google’s idea that a camera is “a powerful way to access information and understand the world around you”. The first Search On 22 announcement for Search was Multisearch.
Multisearch (an existing feature) lets you search with text and images at the same time, much like pointing to something in a store and asking a sales assistant to tell you more about it. In May, Google added Multisearch Near Me, which lets you ask Google about what you’re looking at and add the keyphrase “near me” to find things such as local restaurants and retailers. As of today, it’s out of beta and in the hands of English-speaking users, with support for 70 other languages rolling out in the coming months.
Find the perfect dish will use Lens in the Google app to search a screenshot of an item – in this case, food – to better identify it. Those in the U.S. can add ‘near me’ to their search (thanks to Multisearch) to find somewhere nearby that serves it.
Google also announced that people in the U.S. will be able to use its new Results About You feature, which aims to give people a simpler way to get their sensitive personal information out of the company’s search results. Next year, Results About You will become proactive, letting users opt in to alerts when new personal information about them appears in search results so they can request removal more quickly. This would be super helpful for those caught up in the Optus data breach.