At its recent Search On event, Google unveiled a number of search-related innovations and ways the company is using artificial intelligence to help people understand the world around them and make Search results better. At the livestream event, Google shared how it’s bringing “the most advanced AI into their products to further their mission to organize the world’s information and make it universally accessible and useful.”

Some of the improvements will soon appear in the search giant's services, including Search, Maps, and Assistant.

Here are the biggest announcements from the event:

1. New AI-powered tools

Google says its new spell-checking tool will help it handle even the most poorly spelled search queries. According to Cathy Edwards, VP of Engineering at Google, 1 in 10 Google search queries is misspelled, which is why the company has long offered its “did you mean” feature with proper spelling suggestions. By the end of October, the tech giant will roll out an update to this feature that uses a new spelling algorithm powered by a neural net with nearly 700 million parameters.
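Google's production spell-checker is a large neural net, but the basic "did you mean" idea can be illustrated with a classic edit-distance candidate ranker. The sketch below is a toy stand-in, not Google's algorithm; the vocabulary and queries are invented examples.

```python
# Toy "did you mean" suggester using Levenshtein edit distance.
# Google's actual feature uses a ~700M-parameter neural net; this
# only illustrates the underlying spelling-correction idea.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def did_you_mean(query: str, vocabulary: list[str]) -> str:
    """Return the closest known word, or the query itself if it is known."""
    if query in vocabulary:
        return query
    return min(vocabulary, key=lambda w: levenshtein(query, w))

print(did_you_mean("wether", ["weather", "winter", "feather"]))  # prints "weather"
```

A real system would also weight candidates by how often people search for them, not just by edit distance.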

Another new feature is indexing individual passages from web pages. Previously, Google indexed whole pages only, whereas now the company promises to index individual passages as well. According to Google, once the new algorithm rolls out next month, it will improve 7% of all queries across languages.
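The intuition behind passage indexing is that the best answer may sit in one paragraph of an otherwise unrelated page. The sketch below illustrates this with a deliberately simple term-overlap score; Google's actual system uses learned rankers, and the page text here is an invented example.

```python
# Illustrative passage-level retrieval: instead of scoring a whole
# page against a query, split it into passages and surface the best
# one. Term-overlap scoring is a toy stand-in for a learned ranker.

def score(query: str, text: str) -> int:
    """Count distinct query terms that appear in the text (case-insensitive)."""
    terms = set(query.lower().split())
    words = set(text.lower().split())
    return len(terms & words)

def best_passage(query: str, page: str) -> str:
    """Split the page into paragraph 'passages' and return the best match."""
    passages = [p.strip() for p in page.split("\n\n") if p.strip()]
    return max(passages, key=lambda p: score(query, p))

page = (
    "Our store is open every day.\n\n"
    "UV lamps need to be replaced every 9 to 12 months of use."
)
print(best_passage("how often replace uv lamp", page))
```

Here the whole page looks like a store listing, but the second passage is the one that actually answers the query.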

Google is also using artificial intelligence to divide searches into subtopics to help deliver better results. User queries will now be processed with the BERT language-understanding model, which helps return more accurate results in Google searches.

Computer vision and speech recognition will let Google automatically tag and divide videos into chapters, similar to the chapter markers YouTube already lets creators add manually.


2. New “hum to search” feature

This feature is a lifesaver for anyone who can’t get a song out of their head. The tool will figure out what the song is if you simply hum, whistle, or sing that earworm; machine learning models then try to identify it.

The new tool is already available in Google Assistant and the Google app on both iOS and Android. To use it, simply ask Google “What’s this song?” or tap the “Search a song” button and hum the tune.

Google says it trains its machine learning models on “a variety of sources, including humans singing, whistling or humming, as well as studio recordings.”
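Google has described hum-to-search as turning audio into a number sequence representing the melody's shape, then comparing it against fingerprints of known songs. One reason contour-style matching works is that it ignores which key you hum in. The toy sketch below matches up/down/flat pitch contours; the song data is invented, and real matching uses learned embeddings rather than exact contours.

```python
# Toy melody matcher: compare pitch contours (direction of movement
# between notes), which makes matching invariant to the key the user
# hums in. A stand-in for Google's learned melody embeddings.

def contour(pitches: list[int]) -> list[int]:
    """Direction of movement between consecutive notes: -1, 0, or +1."""
    return [(b > a) - (b < a) for a, b in zip(pitches, pitches[1:])]

def distance(a: list[int], b: list[int]) -> int:
    """Mismatch count between two equal-length contours."""
    return sum(x != y for x, y in zip(a, b))

def identify(hummed: list[int], catalog: dict[str, list[int]]) -> str:
    """Return the catalog song whose contour best matches the hum."""
    hum_c = contour(hummed)
    return min(catalog, key=lambda name: distance(hum_c, contour(catalog[name])))

catalog = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],  # MIDI note numbers
    "Ode to Joy":      [64, 64, 65, 67, 67, 65, 64],
}
# The user hums Twinkle Twinkle three semitones low:
print(identify([57, 57, 64, 64, 66, 66, 64], catalog))  # prints "Twinkle Twinkle"
```

Because only the shape of the melody is compared, the transposed hum still matches the right song.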

3. Using augmented reality (AR) while learning and shopping

Google Lens can now recognize 15 billion objects, up 14 billion from just last year. It can identify plants, animals, landmarks, and more.

Google Lens can also be used to solve quadratic equations or to learn foreign languages: it can translate more than 100 languages and includes a text-to-speech function.
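Behind the homework feature, solving a quadratic comes down to the quadratic formula. A minimal solver, as a plain illustration of the math rather than anything about how Lens works internally:

```python
# Real roots of ax^2 + bx + c = 0 via the quadratic formula
# x = (-b ± sqrt(b^2 - 4ac)) / 2a. Illustrates the math only.
import math

def solve_quadratic(a: float, b: float, c: float) -> tuple[float, float]:
    d = b * b - 4 * a * c          # discriminant
    if d < 0:
        raise ValueError("no real roots")
    root = math.sqrt(d)
    return ((-b + root) / (2 * a), (-b - root) / (2 * a))

print(solve_quadratic(1, -5, 6))   # prints (3.0, 2.0)
```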

Using Lens, you can also search the web for a product captured in a photo or screenshot. When you tap and hold an image in the Google app or Chrome on Android, Lens will find the same or similar items.


4. New live busyness updates on Google Maps

Google Maps will learn to show how busy supermarkets, beaches, parks, pharmacies, gas stations, and laundromats are. This data will be displayed directly on the map, without the need to open a place's listing. According to the company, this will help users avoid crowded places during the pandemic.
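The "busier than usual" labels boil down to comparing a live visitor estimate against the typical count for that hour. Google derives both from aggregated, anonymized location history; the sketch below only illustrates the comparison, and the numbers and thresholds are invented.

```python
# Toy live-busyness labeler: compare a live visitor estimate with the
# typical count for this hour. Thresholds here are invented, not
# Google's; real data comes from aggregated location history.

def busyness_label(live: int, typical: int) -> str:
    if typical == 0:
        return "no data"
    ratio = live / typical
    if ratio < 0.75:
        return "less busy than usual"
    if ratio > 1.25:
        return "busier than usual"
    return "about as busy as usual"

# A supermarket that typically sees 40 visitors at 5 pm now has 62:
print(busyness_label(62, 40))  # prints "busier than usual"
```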

Also, using the augmented-reality Live View feature, you can get information about nearby establishments: tap a building to see its rating, opening hours, popular times, and real-time busyness.

5. Google Duplex is getting smarter

The Duplex assistant, introduced in 2018, uses speech synthesis and artificial intelligence algorithms to phone businesses and make reservations on a user's behalf.

To help people find accurate local store information online, Duplex technology now automatically calls businesses to update their opening hours, takeout options, and contactless payment options in Search and Google Maps.