At their 2022 Search On event, Google announced improvements to their Search products powered by advances in AI.
The company says the goal is to help people find exactly what they’re looking for by combining images, sounds, text and speech: users will be able to ask questions with fewer words, or even none at all, and Google will still understand what they mean.
They highlighted three new features at Search On.
Making visual search work more naturally
Google is making visual search more natural with multisearch, a completely new way to search using images and text simultaneously, similar to how you might point at something and ask a friend a question about it. Multisearch launched earlier this year as a beta in the U.S., and at Search On Google announced it will expand to more than 70 languages in the coming months.

Google is taking this capability even further with “multisearch near me,” which lets you take a picture of an unfamiliar item, such as a dish or plant, and then find it at a place nearby, like a restaurant or gardening shop. “Multisearch near me” will start rolling out in English in the U.S. this fall.

Multisearch enables a completely new way to search using images and text simultaneously.
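Google hasn’t published how multisearch works internally, but the core idea of refining an image query with text can be illustrated with a simple late-fusion retrieval sketch in Python. It assumes a CLIP-style encoder has already mapped the query photo, the query text and every catalog item into one shared embedding space; the fused query is just the average of the two unit vectors, ranked against the catalog by cosine similarity. All names and data below are hypothetical, not Google’s actual system.

```python
# Illustrative late-fusion retrieval for an image + text ("multisearch") query.
# Assumes precomputed embeddings from a CLIP-style encoder (not shown here);
# Google's real pipeline is not public, so this is only a conceptual sketch.
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    """Scale vectors to unit length so dot products become cosine similarities."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def multisearch(image_emb: np.ndarray, text_emb: np.ndarray,
                catalog_embs: np.ndarray, top_k: int = 5) -> np.ndarray:
    """Return indices of the top_k catalog items matching image and text jointly."""
    # Fuse the two modalities with a simple average of unit vectors.
    query = normalize(normalize(image_emb) + normalize(text_emb))
    # Cosine similarity against every catalog item, highest first.
    scores = normalize(catalog_embs) @ query
    return np.argsort(-scores)[:top_k]

# Toy usage with 4-dimensional embeddings.
img = np.array([0.9, 0.1, 0.0, 0.0])   # e.g. a photo of a patterned dress
txt = np.array([0.0, 0.0, 1.0, 0.0])   # e.g. the text refinement "green"
catalog = np.random.default_rng(0).normal(size=(100, 4))
print(multisearch(img, txt, catalog))
```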
Translating the world around you
One of the most powerful aspects of visual understanding is its ability to break down language barriers.
Google is now able to blend translated text into the background image thanks to a machine learning technique called generative adversarial networks (GANs). If you point your camera at a magazine in another language, for example, you’ll now see translated text realistically overlaid onto the pictures underneath.

With the new Lens translation update, you’ll now see translated text realistically overlaid onto the pictures underneath.
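The overlay step can be pictured as a small pipeline: detect the text regions, reconstruct the background beneath them, then draw the translated strings back on top. Lens uses a GAN-based model for the reconstruction; in the sketch below classical OpenCV inpainting stands in for it, and translate_text is a hypothetical placeholder for any translation service, so this is an approximation rather than Google’s implementation.

```python
# Rough sketch of a Lens-style translation overlay.
# cv2.inpaint (classical) stands in for the GAN-based background reconstruction
# Google describes; translate_text is a placeholder for a real translation API.
import cv2
import numpy as np

def translate_text(text: str, target_lang: str = "en") -> str:
    return text  # placeholder: call a translation service here

def overlay_translation(image_bgr: np.ndarray, text_boxes, target_lang: str = "en"):
    """text_boxes: list of (text, (x, y, w, h)) tuples produced by an OCR step."""
    # 1. Mask the pixels covered by the original text.
    mask = np.zeros(image_bgr.shape[:2], dtype=np.uint8)
    for _, (x, y, w, h) in text_boxes:
        mask[y:y + h, x:x + w] = 255

    # 2. Reconstruct the background under the text (GAN in Lens, TELEA here).
    clean = cv2.inpaint(image_bgr, mask, 3, cv2.INPAINT_TELEA)

    # 3. Draw the translated strings over the reconstructed background.
    for text, (x, y, w, h) in text_boxes:
        cv2.putText(clean, translate_text(text, target_lang), (x, y + h),
                    cv2.FONT_HERSHEY_SIMPLEX, h / 30.0, (0, 0, 0), 2, cv2.LINE_AA)
    return clean
```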
Exploring the world with immersive view
Just as live traffic in navigation made Google Maps dramatically more useful, Google is making another major advance in mapping by bringing insights like weather and how busy a place is to life with immersive view in Google Maps. With this new experience, you can get a feel for a place before you even set foot inside, so you can confidently decide when and where to go.
Say you’re interested in meeting a friend at a restaurant. You can zoom into the neighborhood and the restaurant itself to get a feel for what it might be like on the date and at the time you plan to meet up, visualizing things like the weather and learning how busy it might be. By fusing their advanced imagery of the world with their predictive models, they can give you a feel for what a place will be like tomorrow, next week, or even next month. The first iteration of this is rolling out now with aerial views of 250 landmarks, and immersive view will come to five major cities in the coming months, with more on the way.

Immersive view in Google Maps helps you get a feel for a place before you even visit.
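Google hasn’t said how those predictive signals are combined, but the flavor of the experience, fusing a busyness estimate and a weather forecast into one preview for a chosen time, can be sketched with entirely invented data and function names:

```python
# Purely illustrative: combine two toy "models" (hourly busyness, daily weather)
# into a single preview of a place at a future time. Nothing here reflects
# how Google Maps actually builds immersive view.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlacePreview:
    place: str
    when: datetime
    expected_busyness: float  # 0.0 (empty) to 1.0 (packed)
    forecast: str

def preview(place: str, when: datetime,
            busyness_by_hour: dict[int, float],
            weather_by_day: dict[str, str]) -> PlacePreview:
    """Fuse a busyness curve and a weather forecast for the requested time."""
    busy = busyness_by_hour.get(when.hour, 0.0)
    weather = weather_by_day.get(when.strftime("%Y-%m-%d"), "unknown")
    return PlacePreview(place, when, busy, weather)

# Example: checking a restaurant for Friday at 19:00.
print(preview("Example Bistro", datetime(2022, 10, 14, 19, 0),
              busyness_by_hour={18: 0.6, 19: 0.9, 20: 0.8},
              weather_by_day={"2022-10-14": "light rain, 12°C"}))
```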
These announcements, along with many others introduced at Search On, are just the start of how Google is transforming their products to help you go beyond the traditional search box.