Google has made a series of announcements, all aimed at improving its core product: search. At its Search On event, the company made it clear that it will focus on using AI to help its users. New features include a new algorithm to better handle spelling errors in user queries, the ability to identify hummed songs, and new tools to help students with homework.
The company also announced updates to Google Lens and other search-related tools.
Misspellings may not matter
The event, which was broadcast live, was held, in Google’s words, “to share several new advancements in search ranking, made possible by our latest research in artificial intelligence.”
One of the highlights was an improvement in how Google Search handles typos. Google said that 1 in 10 queries is misspelled. The problem is currently addressed with the ‘did you mean?’ function, but Google is improving on it. “Today, we are introducing a new spelling algorithm that uses a deep neural network to significantly improve our ability to decipher spelling errors. In fact, this single change improves spelling more than all of our improvements in the past five years,” said Prabhakar Raghavan, Google’s SVP of Search and Assistant, Geo, Ads, Commerce, Payments and NBU.
The new spelling algorithm helps Google understand the context of misspelled words so it can surface the right results, all in under 3 milliseconds.
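Google has not spelled out how the new model works, but the general idea of scoring candidate corrections against surrounding context can be sketched in a few lines of Python. The vocabulary, bigram counts and scoring below are invented purely for illustration and have nothing to do with Google’s neural model.

```python
import difflib

# Toy vocabulary with word counts standing in for a language model.
# Purely illustrative -- not Google's neural spelling model.
VOCAB = {"weather": 900, "whether": 700, "wether": 5, "forecast": 400, "or": 5000}
BIGRAMS = {("weather", "forecast"): 300, ("whether", "or"): 250}

def correct(word, next_word=None):
    """Pick the closest in-vocabulary word, using the next word as context."""
    candidates = difflib.get_close_matches(word, VOCAB, n=3, cutoff=0.7) or [word]
    def score(cand):
        # Context (bigram) evidence outweighs raw word frequency.
        return BIGRAMS.get((cand, next_word), 0) * 10 + VOCAB.get(cand, 0)
    return max(candidates, key=score)

print(correct("wether", next_word="forecast"))  # -> weather
print(correct("wether", next_word="or"))        # -> whether
```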
Google expects its new technology to improve 7 percent of search queries in all languages as it rolls out globally.
Important data now available
Google is also integrating various data sources, previously available only through its open Data Commons project, directly into Search.
“Now when you ask a question like ‘how many people work in Chicago,’ we use natural language processing to map your search to a specific set of the billions of data points in Data Commons to provide the correct statistics in a visual, easy-to-understand format. You will also find other relevant context and data points, such as statistics from other cities, to help you easily explore the topic in more depth,” Google said.
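For anyone who wants the raw numbers, Data Commons also exposes its statistics through public APIs. The snippet below is a minimal sketch using the open `datacommons` Python client; the place identifier for Chicago and the statistical-variable name are assumptions and should be verified against the Data Commons browser at datacommons.org.

```python
# Minimal sketch: fetching a statistic of the kind Search now surfaces.
# Requires the public Data Commons client: pip install datacommons
import datacommons as dc

CHICAGO = "geoId/1714000"           # DCID for the city of Chicago (assumed)
EMPLOYED = "Count_Person_Employed"  # statistical variable for employed people (assumed)

value = dc.get_stat_value(CHICAGO, EMPLOYED)
print(f"People employed in Chicago: {value:,.0f}")
```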
Key moments in video content
As the world increasingly embraces video content, Google has developed advanced computer vision and speech recognition to tag key moments in videos.
“With a new AI-powered approach, we can now understand the deep semantics of a video and automatically identify key moments. This allows us to tag those moments in the video, so you can navigate through them like chapters in a book. Whether you’re looking for that one step in a recipe tutorial, or the game-winning home run in a highlights reel, you can easily find those moments. We began testing this technology this year, and by the end of 2020 we expect that 10 percent of Google searches will use this new technology,” Google said.
Hum and get it
Google said it can now identify a song from a user’s humming or whistling. In Google Search, users can tap the microphone icon and say “What song is this?” or click “Find a song”, then hum for 10-15 seconds. In Google Assistant, users can say, “Ok Google, what song is this?” and then hum the melody.
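Under the hood, hum-to-search systems generally reduce a melody to a pitch contour or fingerprint and match it against a database of songs. The sketch below illustrates that general technique with librosa’s pitch tracker and dynamic time warping; it is not Google’s model, and the audio file names are placeholders.

```python
# Illustrative melody matching: compare a hum to a candidate song by pitch
# contour. Not Google's system -- just the general technique, using librosa.
import numpy as np
import librosa

def pitch_contour(path, sr=16000):
    """Extract a transpose-invariant pitch contour (semitones around the mean)."""
    y, _ = librosa.load(path, sr=sr)
    f0, voiced, _ = librosa.pyin(y, fmin=80, fmax=800, sr=sr)
    f0 = f0[voiced & ~np.isnan(f0)]       # keep only voiced, defined frames
    semitones = 12 * np.log2(f0)          # log scale so intervals are linear
    return semitones - semitones.mean()   # remove the key the user hummed in

def melody_distance(hum_path, song_path):
    """Align the two contours with dynamic time warping; lower = more similar."""
    a = pitch_contour(hum_path)[np.newaxis, :]
    b = pitch_contour(song_path)[np.newaxis, :]
    cost, _ = librosa.sequence.dtw(X=a, Y=b, metric="euclidean")
    return cost[-1, -1] / cost.shape[1]   # crude length normalization

# Placeholder file names -- any two short mono recordings will do:
# print(melody_distance("hum.wav", "candidate_melody.wav"))
```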
Lens lends itself to multiple uses
Another cool addition is in Google Lens, which can now read aloud a passage photographed from a book, regardless of the language. With Google Lens, users can also tap on any image they find and it will surface similar items and suggest ways to style outfits around them.
With the same Lens feature, users can take a photo of a homework problem and Lens will convert the image of the question into a search query. The results will show how to solve the problem step by step.
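The first step in any such pipeline is turning the photo into text. As a stand-in for Lens’s own recognition stack, the sketch below uses the open-source Tesseract engine via pytesseract to OCR a photographed question and build a search URL from it; the file name is a placeholder.

```python
# Sketch: photo of a homework question -> plain-text search query.
# Uses open-source Tesseract OCR (pip install pytesseract pillow) as a
# stand-in for Lens's own text recognition.
from urllib.parse import quote_plus

from PIL import Image
import pytesseract

def photo_to_query(image_path):
    """OCR the photographed question and collapse it into a one-line query."""
    text = pytesseract.image_to_string(Image.open(image_path))
    return " ".join(text.split())

query = photo_to_query("homework_problem.jpg")   # placeholder file name
print("https://www.google.com/search?q=" + quote_plus(query))
```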
Maps have more on them
On the shopping front, if you’re looking for, say, a car, Google will now be able to provide an AR view so you can see it up close without visiting a showroom. “Social distancing has also dramatically changed the way we shop, so we’re making it easier to visually shop for what you’re looking for online, whether you’re looking for a sweater or want to get a closer look at a new car but can’t visit a showroom,” Google said.
On Google Maps, Google will display live information directly on the map, so you don’t have to search for it specifically. “We are also adding COVID-19 safety information front and center in business profiles on Google Search and Maps. This will help you know if a business requires you to wear a mask, if you need to make an advance reservation, or if the staff are taking additional safety precautions, like temperature checks. And we’ve used our Duplex conversational technology to help local businesses keep their information up to date online, like store opening hours and inventory,” Google said.
For journalists too
As part of the Journalist Studio, Google is launching Pinpoint, a new tool that brings the power of Google Search to journalists. “Pinpoint helps reporters quickly examine hundreds of thousands of documents by automatically identifying and organizing the most frequently mentioned individuals, organizations and locations. Reporters can sign up to request access to Pinpoint starting this week,” Google said.
Source: Google