New Advancements in Search – Search On 2020.



“Search will never be a solved problem; we will never stop on the quest to perfect search.”

At Search On, Cathy Edwards outlined four incoming updates to how Google’s information retrieval systems work, each of which will directly impact how search happens and the results that are served. All four areas are expected to receive enhancements within the next month.

1. Did You Mean.

‘Did you mean’ is the technology that recognises typos and common errors in a search, works out what the user really meant, and serves results that match that intent. It has been built on detecting mistakes: tracking what a user types the first time, how they then refine it, and mapping the differences between those queries.

Whilst no specifics were detailed, Cathy outlined that this is an area of heavy investment where breakthroughs have been made, showing examples where the original query barely resembled what the user really meant and the mistyping came from key proximity rather than phonetic spelling. Currently 1 in 10 queries involves some form of misspelling, so improving the way Google processes these will have a huge impact on how search works.
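To make the key-proximity idea concrete, here’s a toy Python sketch of typo correction that makes substitutions between neighbouring keys ‘cheaper’ in an edit-distance calculation. It’s purely illustrative – the vocabulary, adjacency map and weights are invented, and Google’s real spelling systems use far more sophisticated models.

```python
# Toy 'did you mean' correction: mistakes between neighbouring keyboard keys
# count as cheaper than other substitutions. Vocabulary, adjacency map and
# weights are invented for the example.

# Hypothetical mini-vocabulary of known terms.
VOCAB = ["dinner", "restaurant", "recipes", "weather"]

# Abridged QWERTY adjacency map (only the keys needed for this example).
ADJACENT = {"e": "wrds", "r": "etdf", "t": "rygf", "n": "bhjm", "i": "ujko"}

def sub_cost(a: str, b: str) -> float:
    """Substituting a key for one of its neighbours counts as a smaller error."""
    if a == b:
        return 0.0
    return 0.4 if b in ADJACENT.get(a, "") else 1.0

def weighted_edit_distance(s: str, t: str) -> float:
    """Levenshtein distance with proximity-aware substitution costs."""
    dp = [[0.0] * (len(t) + 1) for _ in range(len(s) + 1)]
    for i in range(1, len(s) + 1):
        dp[i][0] = float(i)
    for j in range(1, len(t) + 1):
        dp[0][j] = float(j)
    for i in range(1, len(s) + 1):
        for j in range(1, len(t) + 1):
            dp[i][j] = min(
                dp[i - 1][j] + 1,                                 # deletion
                dp[i][j - 1] + 1,                                 # insertion
                dp[i - 1][j - 1] + sub_cost(s[i - 1], t[j - 1]),  # substitution
            )
    return dp[len(s)][len(t)]

def did_you_mean(query: str) -> str:
    """Return the vocabulary entry closest to the (possibly mistyped) query."""
    return min(VOCAB, key=lambda word: weighted_edit_distance(query, word))

print(did_you_mean("dinnrr"))  # 'r' sits next to 'e', so this resolves to "dinner"
```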

2. Specific Searches.

Often a user searches a very niche question, or they’re looking for an exact answer to a problem they’re facing. Cathy outlined a breakthrough in how Google indexes the answers to these, which will impact 7% of queries from next month. At present, Google indexes the web pages which can answer this type of query, but it’s shifting to also index individual passages – the specific answer as well as the context around it. We’d expect this to affect ‘utility’ searches where there is a useful, simple answer.

Traditional search, where full web pages are returned, involves information retrieval in which Google looks down into its index and surfaces the most valuable answer. Whilst the result is still fast, it requires heavy lifting in how Google processes it, with hundreds of ranking factors considered to order the content. By using block-level indexing, however, Google can look up into its categorised Knowledge Graph and immediately return the content that answers the user’s question.
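As a rough illustration of the difference, the sketch below splits pages into passages and ranks the passages themselves against the query using a crude term-overlap score. The page content, URL and scoring are hypothetical stand-ins – Google’s passage ranking is powered by neural language models, not this kind of keyword matching.

```python
# Passage-level scoring sketch: break each page into passages and rank the
# passages against the query, rather than ranking whole pages.
import re
from dataclasses import dataclass

@dataclass
class Passage:
    url: str
    text: str

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z]+", text.lower()))

def split_into_passages(url: str, page_text: str, size: int = 40) -> list[Passage]:
    """Naive splitter: break a page into blocks of roughly `size` words."""
    words = page_text.split()
    return [Passage(url, " ".join(words[i:i + size])) for i in range(0, len(words), size)]

def score(query: str, passage: Passage) -> float:
    """Crude relevance: fraction of query terms that appear in the passage."""
    q = tokens(query)
    return len(q & tokens(passage.text)) / max(len(q), 1)

def best_passage(query: str, pages: dict[str, str]) -> Passage:
    candidates = [p for url, text in pages.items() for p in split_into_passages(url, text)]
    return max(candidates, key=lambda p: score(query, p))

# Hypothetical corpus: a long, mostly off-topic page where one paragraph answers the query.
pages = {
    "https://example.com/diy-forum": (
        "General chat about home maintenance tools and suppliers. " * 6
        + "To check whether your house windows are UV filtering, shine a UV "
          "torch through the glass at a fluorescent object and see if it glows. "
        + "More unrelated discussion about paint colours and flooring. " * 6
    ),
}

print(best_passage("how to check if windows are uv filtering", pages).text)
```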

It’s also worth noting the time and support Google has given to open-source technology such as AMP and the Gutenberg framework, as these enable block-level creation of content that will integrate well with this form of indexation.

3. Broad Searches.

After covering specific searches, Cathy moved on to the advancements in broad searches. The implication was that whilst users may type a broad query, they are often actually looking for a more specific answer, or their next query will add more specific refinements towards the answer they need. For example, after a broad search about renovating a bathroom, they may refine to a related query about renovating a bathroom on a budget. In effect, they go from searching a topic to searching a sub-topic.

Google has been collecting data and learning about these related sub-topics for years. In fact, at the equivalent event in 2019, it announced Topic Cards as part of how it was linking multiple topics together and understanding the areas within these broader categories. Cathy shared some visuals of a search where the user then clicks down into the particular niche they’re most interested in. Whilst these weren’t true search result visuals, they may be a clue to how Google will push related queries and ‘people also searched’ results higher up the journey and the search results pages in future.
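For a sense of that broad-to-specific journey, here’s a toy sketch that groups hypothetical refinement queries for the bathroom-renovation example into sub-topics by their modifier words. The query list and sub-topic labels are invented; Google builds its sub-topic understanding with large-scale machine learning, not hand-written rules like these.

```python
# Toy grouping of refinement queries into sub-topics of a broad topic,
# using hand-picked modifier words as a stand-in for learned sub-topics.
from collections import defaultdict

BROAD_TOPIC = "bathroom renovation"

# Hypothetical follow-up queries users issue after the broad search.
refinements = [
    "bathroom renovation on a budget",
    "cheap bathroom renovation ideas",
    "small bathroom renovation layout",
    "small bathroom renovation storage",
    "luxury bathroom renovation cost",
]

# Hand-labelled modifier vocabulary standing in for learned sub-topics.
SUBTOPIC_MODIFIERS = {
    "budget": {"budget", "cheap", "affordable"},
    "small spaces": {"small", "compact"},
    "high end": {"luxury", "designer"},
}

def assign_subtopic(query: str) -> str:
    """Place a query under the first sub-topic whose modifiers it mentions."""
    words = set(query.lower().split())
    for subtopic, modifiers in SUBTOPIC_MODIFIERS.items():
        if words & modifiers:
            return subtopic
    return "general"

grouped = defaultdict(list)
for q in refinements:
    grouped[assign_subtopic(q)].append(q)

for subtopic, queries in grouped.items():
    print(f"{BROAD_TOPIC} -> {subtopic}: {queries}")
```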

4. Video.

Similar to Google’s indexation of content in passages and its use of AI to group broad themes, Cathy also announced how AI is being applied to video content. Previously, content creators have been encouraged to tag their videos with timestamps of key moments and with transcripts, which help Google process and understand the content within the video. Now, Google has introduced AI and natural-speech understanding that can automatically tag those key moments as chapters within the video, so they can be recalled to answer either a specific or a broad query as required.
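To picture what automatic chapter tagging might involve, the toy sketch below segments a made-up transcript wherever consecutive segments share few words. The transcript, timings and threshold are all invented, and the announcement refers to far more capable speech and language models than this word-overlap heuristic.

```python
# Toy chapter detection from a video transcript: start a new chapter where
# consecutive transcript segments share few words.
import re

# Hypothetical transcript segments with start times in seconds.
transcript = [
    (0,   "welcome to the channel today we are renovating a bathroom"),
    (25,  "first let's talk about planning the renovation and setting a budget"),
    (60,  "a realistic budget keeps the renovation on track"),
    (120, "now on to tiling start by preparing the wall surface"),
    (150, "spread tile adhesive evenly on the wall before placing each tile"),
]

def words(text: str) -> set[str]:
    return set(re.findall(r"[a-z]+", text.lower()))

def detect_chapters(segments, min_overlap: float = 0.25):
    """Start a new chapter when word overlap with the previous segment drops."""
    chapters = [segments[0][0]]  # the first chapter starts with the first segment
    for (_, prev_text), (start, text) in zip(segments, segments[1:]):
        overlap = len(words(prev_text) & words(text)) / max(len(words(text)), 1)
        if overlap < min_overlap:
            chapters.append(start)
    return chapters

print(detect_chapters(transcript))  # -> [0, 25, 120] with this toy data
```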

Whilst we already see these ‘fraggles’ in the SERP, where Google flags the parts of a video it recommends you view, that has previously been based on other users’ behaviour and the content owner’s mark-up of the video. This new advancement takes it into Google’s hands.
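For reference, the creator-supplied mark-up mentioned above is typically added as schema.org structured data on the video page – a VideoObject with its key moments declared as Clip entities. The example below is a minimal, hypothetical version with invented URLs, titles and timings.

```python
# Minimal, hypothetical key-moment mark-up for a video page: a schema.org
# VideoObject whose key moments are declared as Clip entities.
import json

video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to renovate a bathroom",
    "description": "A step-by-step bathroom renovation walkthrough.",
    "uploadDate": "2020-10-15",
    "contentUrl": "https://example.com/videos/bathroom-renovation.mp4",
    "hasPart": [
        {
            "@type": "Clip",
            "name": "Planning and budget",
            "startOffset": 30,    # seconds into the video
            "endOffset": 210,
            "url": "https://example.com/videos/bathroom-renovation?t=30",
        },
        {
            "@type": "Clip",
            "name": "Tiling the walls",
            "startOffset": 211,
            "endOffset": 540,
            "url": "https://example.com/videos/bathroom-renovation?t=211",
        },
    ],
}

# Emit the JSON-LD that would sit in a <script type="application/ld+json"> tag.
print(json.dumps(video_markup, indent=2))
```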

As for the future impact, watch this space. I’d predict it will push video creation to become more strategic for search: the ‘key search moments’ will need to be factored in and highlighted in the language and style of the video so they’re picked up most effectively by the AI.

One for the Future.

The final element of Cathy’s presentation covered an area that isn’t as frequently talked about: optimising search specifically for journalists looking for facts. This may have sounded inconsequential, but it could unfold to be the biggest update of them all. Google announced a new platform for journalists called Pinpoint (g.co/pinpoint); to sign up you need to work for a news organisation, and you can then use its data to “find the facts that matter”. This is likely an extension of Google’s investment in The Trust Project and its funding of journalists to improve the quantity of factual reporting and the quality of journalism on the internet.

The sceptical amongst us could also note that, in creating Pinpoint, Google has built the ultimate testing ground for whether a fact is correct or not, by presenting facts directly to the people whose job it is to investigate true stories. When you sign up you must agree to give feedback to Google when articles involving those facts get published, thereby confirming that the data used was accurate and allowing an automated review of the context the data was used in. Over time, Pinpoint can support Google’s machine learning of what is and isn’t trustworthy data by asking those responsible for truthful reporting to share that information with it. Whilst there’ll be no immediate impact on search for us, this is definitely one to keep an eye on!

Check out the rest of our Search On roundups here:

  1. Delivering a Higher Standard of Google Search (talk 1)
  2. 3 New Enhanced Google Features (talk 3)