Five thousand—that is the number of changes Google made to its search engine in 2021. It made 4,500 changes in the year before that. Google is constantly tweaking the algorithm of Google Search to give its users a more natural and intuitive experience. It performs hundreds of thousands of experiments to develop features that keep up with the evolution of human behaviour. Every year, Google hosts an event called Search On where it announces these major changes. This year’s event was held two days ago on the 28th of September.
Prior to the open-to-all event, Google executives said in several interviews that search is undergoing a “total reinvention”. Given the growing preference for apps like TikTok for searches, and Google’s own trend of offering ways to search that go beyond the text box, the announcements at the event were exciting but not surprising. The new developments affect every page and website that pops up on Google, so whether you are an SEO expert, a website owner or a searcher, you will want to know what these updates are.
At this year’s event, Search On 22, Google announced that:
Multisearch is expanding
In April, Google introduced a multisearch feature. This feature mimics the way we point at objects and ask questions about them in real life. It allows Google app users to take a photo of an object, or a screenshot, and add text describing physical attributes like colour to narrow the search. Since then, 8 billion Google searches have been made each month using a camera or an image. Multisearch turned out to be a successful tool for online shopping: users would take a picture of an outfit and type in a colour like blue to find that outfit in that colour. The feature was initially available only in the US, but at the Search On event, Google announced that it will roll out globally in 70 languages over the next few months.
Multisearch near me
There’s more. Building on the multisearch feature, Google is taking a leap forward with an upcoming feature called “multisearch near me”, which will start rolling out in America later this year. Using a combination of pictures and text, Google users can locate local retailers who sell apparel or home goods. You can also search with a photo of a dish to find nearby restaurants that serve it.
Translation will happen in the blink of an eye
In its quest to break down language barriers, Google has gone beyond translating text to translating pictures. Every month, people use Google to translate text in over 100 languages more than 1 billion times. Through machine learning, its image recognition technology has advanced to the point where, if you point your phone at text in an image, such as a poster, Lens will translate the text in place. Google has announced that an improved version of this feature will launch later this year. With the new update, Google will blend the translated text into the picture beneath it more realistically, so that the only visible difference in the image is the translated text. Using generative adversarial networks (also known as GAN models), Google has also optimised the process to run in just 100 milliseconds—quicker than the blink of an eye.
Quicker ways to find what you’re searching for
Sometimes we do not know what we are searching for until we see it. In the coming months, Google will help you refine your questions by providing keyword or topic options so you can find the results most relevant to you. For example, if you’re looking for a holiday destination in South Africa, Google might suggest options like “best cities in South Africa for families”.
Soon, when you begin to type a question, Google will start providing relevant content right away, before you’ve even finished typing. This is to help you find answers to your questions with fewer words, or even none at all.
New ways to explore information
Google is tailoring the way it displays search results to the ways we explore topics. Soon Google search will begin to present the most relevant content, from a variety of sources in different formats—text, images or video—including content from creators on the open web.
In the coming months, if you search for topics like food, you may see videos from people who have eaten the dish. If you search for a city, you may see images and videos of other people exploring it, tips on what to know before you travel there, and things to do when you arrive.
New ways to use the map
At Search On, Google shared its transformational plans for Google Maps. Its plans entail new features that will make Google Maps more visual and intuitive in ways that will allow you to experience a place as if you’re there.
One of these features is “neighbourhood vibe”, which allows you to select a neighbourhood and see its most popular spots come to life. This is made possible by AI combined with local knowledge from Google Maps users, who add more than 20 million reviews, photos and videos to the app daily. Neighbourhood vibe starts rolling out globally on Android and iOS in the coming months.
Google is also creating a more immersive map with a new feature called immersive view. To that end, it launched over 250 photorealistic aerial views of global landmarks, ranging from the Tokyo Tower to the Acropolis. The map will also show helpful information about locations, such as how busy they will be at certain times of the day, the expected weather conditions, where the entrances are, and possibly what the inside of a building looks like.
There is also an upcoming Live View feature, which overlays arrows and directions on top of your real-world surroundings so you don’t get lost. It will start rolling out in London, Los Angeles, New York, San Francisco, Paris and Tokyo in the coming months on Android and iOS.
Google for iOS updates
From now on, iOS users will be able to access shortcuts below the search bar, including shortcuts to shop using screenshots, translate text with the camera, hum to search and more.
Google has these and several other interesting features planned, and you can learn more about them by watching the full video of the Search On 22 event or a recap.