Google is the search engine of choice for roughly 92% of people worldwide. People come to it looking for answers, often to questions they can’t quite frame or know nothing about. BERT is the biggest change to Google’s algorithm in the last five years. With this update, Google aims to understand our complex long-tail search queries and display more relevant results. Google has long been trying to grasp human semantic context using Natural Language Processing, and BERT greatly improves its ability to deliver refined, much closer results.
Has this ever happened to you: you are trying to find the right words, or to remember what you wanted to say, but you are completely at a loss, so you turn to Google to figure it out? You then type random words into the search bar, hoping Google might somehow surface the answer.
Well, Google was hardly equipped to answer such long-tail questions. But with Bidirectional Encoder Representations from Transformers, or as we know it, Google’s BERT update, you’ll get exactly what you’re looking for, despite the random keywords you feed into Google.
This advance is built on a neural network architecture called the transformer. Google can now interpret prepositions such as ‘for’ and ‘to’ in relation to the words around them, rather than treating each word in isolation.
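To give a flavour of how a transformer relates each word to its neighbours, here is a minimal single-head self-attention sketch in plain NumPy. This is an illustration of the mechanism only, not Google’s actual system: the learned query/key/value projection matrices are omitted for brevity, and the token embeddings are made-up numbers.

```python
import numpy as np

def softmax(x):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X):
    """Simplified single-head self-attention.

    Every token attends to every other token in both directions,
    so a preposition's output vector is a mixture of the words
    around it (identity Q/K/V projections for brevity).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)   # pairwise token similarities
    weights = softmax(scores)       # each row sums to 1
    return weights @ X              # context-mixed token vectors

# Toy 4-token "sentence"; each row is an invented embedding.
tokens = ["traveler", "to", "usa", "visa"]
X = np.array([[1.0, 0.2, 0.0],
              [0.1, 1.0, 0.3],
              [0.0, 0.4, 1.0],
              [0.7, 0.1, 0.8]])

out = self_attention(X)
# The output row for "to" now blends information from all four
# tokens, which is the sense in which BERT reads bidirectionally.
```

The key point of the sketch: unlike earlier left-to-right models, every row of the output depends on the whole sentence at once, which is why the word ‘to’ can mean something different in ‘traveler to usa’ than it would elsewhere.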
This Google update went live on 24th Oct 2019 for US search results only and is yet to be rolled out to the rest of the world. In the meantime, experts suggest it has affected, or will affect, only about 10% of search queries. Danny Sullivan, Google’s Search Liaison, has also clarified that there is no fixed timeline for releasing the BERT update in other countries.
Which search queries have been affected by BERT?
The BERT update has not changed the search queries themselves; it only helps Google understand and interpret them better. BERT will now be able to analyse and understand the meaning of long-tail search queries like ‘which is the nearest metro station to lotus temple if I want to get down at yellow line instead of violet line?’
Well, Google still can’t fetch good results for this query, as the BERT update is yet to be rolled out in India; in the US, though, a similar query can be answered easily.
Google has shared a few examples; here are the same for your reference:
- In the image attached below, you’ll see how the search results were not even close to what the user was looking for; after the latest update, the user gets a much closer answer to their query.
- As in Image 1, the search results here are close to what the user is looking for, not random results that are of no use to the user.
What do we see if we look into BERT from an SEO’s perspective?
If we want to look at it from an SEO (Search Engine Optimization) perspective, I suggest we turn to an expert’s opinion. Malte Landwehr, VP Product at Searchmetrics, says, “Bert is a logical development for Google, following in the footsteps of Panda, Hummingbird and RankBrain. However, this time we’re not looking at a change in the way data is indexed or ranked. Instead, Google is trying to identify the context of a search query and provide results accordingly. This is an exciting addition to what context-free models like Word2Vec and GloVe are able to offer. For Voice Search and Conversational Search, I would expect to see significant leaps forward in the quality of results in the near future.”
So, what should we do now to make the website rank better?
Well, there is no definitive answer at this moment, as we don’t have enough data to analyse the effects of the latest update. But we suggest you keep writing content for your website: whether or not the update affects your reach, people who are interested in your website will keep coming back and interacting with it.
But why is BERT so important?
According to Google, around 15% of daily searches are new, meaning they are being searched for the very first time. Furthermore, the phrasing of search queries is growing closer to natural human language, especially since Voice Search became available in regional languages. Another reason for BERT is the increasing length of search queries: today around 70% of searches can be considered long-tail.
People look for answers to really long questions, which BERT will increasingly be able to handle, making their lives that much easier.