Who is BERT and why is he messing with Google?

If you’ve been keeping an eye on marketing news the past few days, you might be forgiven for wondering “Who is Bert?” 

You might have thought of the muppet who’s best friends with Ernie, or my personal favourite, the wonderfully eccentric Bert Cooper, of Sterling Cooper fame. 

In this instance, BERT is neither an ageing ad agency senior partner nor a banana-shaped puppet. Bidirectional Encoder Representations from Transformers – BERT to his friends – is Google’s latest innovation in natural language processing (NLP). Essentially, BERT is a way to “pre-train” a question-answering system – like a chatbot, an AI assistant, or a search engine – to maximise its understanding of context and user intent, reading a sentence in both directions at once rather than just left to right (that’s the “bidirectional” bit).
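To make “bidirectional” a little more concrete, here’s a deliberately tiny sketch – plain Python, nothing to do with Google’s actual model. It fills in a masked word using bigram counts from a hand-made four-sentence corpus, once using only the word to the left and once using the words on both sides. The corpus, candidate words and scoring are all invented for illustration; real BERT learns far richer context from billions of words.

```python
from collections import Counter

# Toy illustration only - NOT BERT. It just shows why seeing both sides
# of a missing word helps. All data below is invented for the example.
corpus = (
    "she sat on the river bank to read . "
    "he paid the money into the bank account . "
    "the river bank was muddy . "
    "the bank account was empty ."
).split()

# Count adjacent word pairs (bigrams) in the toy corpus.
bigrams = Counter(zip(corpus, corpus[1:]))

def score(candidate, left, right=None):
    """Count how often `candidate` appears next to its neighbours."""
    s = bigrams[(left, candidate)]
    if right is not None:  # the "bidirectional" part: add right context
        s += bigrams[(candidate, right)]
    return s

sentence = "the [MASK] account was empty".split()
i = sentence.index("[MASK]")
candidates = ["river", "bank"]

left_only = max(candidates, key=lambda w: score(w, sentence[i - 1]))
both_sides = max(candidates, key=lambda w: score(w, sentence[i - 1], sentence[i + 1]))
print(left_only, both_sides)  # left context alone can't split the tie
```

With only the left-hand word “the”, “river” and “bank” are tied in this corpus; adding the right-hand word “account” settles it. That, in miniature, is the kind of ambiguity bidirectional context resolves.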

Google have released BERT as open source, so anyone can use it to pre-train their own system or bot. You might be thinking at this point – “why should I care? I’m not trying to build a chatbot.” Well, if you’re in marketing, and you like people to be able to find your content via Google, you absolutely should care.

Because guess what – BERT doesn’t just represent an innovation in NLP or bot development, but a change in the way Google crawls, processes, understands and ranks content for a search query. Google VP of Search Pandu Nayak has called the BERT update “the single biggest change we’ve had in the last five years and perhaps one of the biggest since the beginning of the company.” So, y’know, kind of a big deal. 


The context

For nearly a decade, Google has been striving to understand user intent and, by extension, whether, when and how the content on the web meets it. Through numerous updates to its core algorithm, as well as proprietary machine learning and NLP models and techniques, Google has built an incredibly detailed and complex understanding of how our search queries, and the content we consume, relate to each other.

There are many fantastic examples of how the developments below work. For some attempt at brevity I won’t go into them here, but I would recommend reading up on them to maximise your understanding. Search for any of the below and Google will almost certainly use its algorithm to correctly interpret your intent and show you some really helpful content.


Knowledge Graph

In 2012, Google launched the Knowledge Graph, with the snappy tagline “things, not strings”. The aim was a model that understood the relationships between things, rather than each thing in isolation. The Knowledge Graph is responsible for those summaries you get to the right of the search results, and for the “people also ask” section.



Hummingbird

Fast forward a year, and Google released the Hummingbird update (with less bombast than the Knowledge Graph – they tend to be quieter about algorithm updates). It was essentially a new “recipe” for ranking content, putting more focus on the meaning behind the words you searched for. Instead of just looking for your keywords within a piece of content, Google could now judge whether that content matched the meaning behind your words. The impetus for Hummingbird was the rise of conversational search, gearing up for the age of the Google Assistant.



RankBrain

In 2015, we were introduced to RankBrain, a component of Google’s core algorithm which uses machine learning to better understand search queries and surface the most relevant content. Rather than just applying the basic algorithm with its (not so basic) 200-odd ranking factors – keyword usage, inbound links and domain authority being just some of them – Google can now take account of things like the searcher’s location, personalised results and – you guessed it – user intent.


Enter BERT

So, as you can see, the past seven years or so have all been gearing up to BERT himself. But what makes BERT different? We are wandering dangerously into the territory of science and things I don’t understand here, so to avoid wading into those murky waters, I’ll say this: the BERT update represents a bigger step towards contextual search and user intent than ever before. Whatever you were doing to produce content post-Hummingbird or post-RankBrain, you need to double, triple, quadruple down on that strategy.


What does this all mean?

It’s a bit of a good news, bad news situation. The bad news is that the way we used to understand SEO – completely driven by keywords – is dead, and there’s nothing we can do about it. There is no easy way to the top. The good news is that this has been the case for a while, and if you’ve been approaching SEO and content in the right way, you’re already doing most of what you need to do to buddy up with ol’ BERT. Content should be focused on delivering value to users, rather than serving as SEO fodder. Rather than leading solely with keywords, use tools like Google Search Console to dig out the queries people are actually using to find your site. What other kinds of content could you offer them that they would find useful? Can you answer their question in a better way?

For me, this just presents another opportunity to focus with more clarity on your user – their experience on your website, how they find, use and understand your content. These are all things you should be doing anyway, regardless of your ranking on Google. 

All that being said, the more “nitty gritty” areas of SEO are still just as important as the “light and fluffy”. It’s still important that your website and its user experience are technically sound – easy for both Google and the user to navigate and understand. Inflowing have experience doing both – if you’d like to find out how your business could be more visible on Google, get in touch.