In October 2019, Google rolled out a groundbreaking change to its search ecosystem with the introduction of BERT. Many authorities in the SEO world (such as Search Engine Land) deemed this “the largest change to Google’s search system since the company introduced RankBrain” back in 2015.
What is BERT? It stands for Bidirectional Encoder Representations from Transformers, and it helps Google better understand the context and complexity of the words people search for so it can return results that best match the user’s search intent. Google has called BERT “one of the biggest leaps forward in the history of Search”. The models will continue to evolve, improve and be fine-tuned so that machines can better understand human language and intent through powerful Natural Language Processing (NLP). For more insight into BERT, check out this FAQ at SEL.
Where did BERT come from? Computers have traditionally struggled to understand language, a challenge that gave rise to NLP. Key building blocks of NLP models include recognizing entities, classifying language, detecting sentiment and untangling the deep complexities of linguistics. BERT combines vector modelling, neural networks and substantial computing power to apply NLP at scale. By processing text bidirectionally, it trains models to better understand the language, meaning and intent behind search queries, learning and becoming more accurate over time.
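To give a feel for what “vector modelling” means, here is a toy, hand-made sketch (not Google’s or BERT’s actual model): each word gets a small vector of numbers, and related words end up pointing in similar directions. Real models learn vectors with hundreds of dimensions from enormous amounts of text; the three-dimensional values below are invented purely for illustration.

```python
import math

# Hypothetical hand-made "embeddings" for illustration only.
# Real NLP models learn these vectors from data.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Score how closely two word vectors point in the same direction (max 1.0)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words score higher than unrelated ones.
print(cosine_similarity(vectors["king"], vectors["queen"]))  # close to 1.0
print(cosine_similarity(vectors["king"], vectors["apple"]))  # much lower
```

The point is simply that once words are numbers, a machine can measure “relatedness” mathematically instead of matching exact keywords.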
Can I optimize for BERT? No. This is not an explicit algorithm update focused on a single ranking element of webpages, the way Penguin, Panda or the Mobile Speed Update were. With BERT, the key is writing and maintaining great content for your users: comprehensively answer questions, provide guidance based on a user’s search intent, and deeply cover the topics and themes related to your business and the products and services you specialize in. A unique component of BERT is that it analyzes the content before and after any keyword in a sentence or string of sentences to help understand that keyword’s meaning, and it combines those learnings with millions of content strings across the web to train itself on what a topic or concept truly means. This puts significantly less emphasis on the “keyword optimization” that has historically been overused by marketers who focus too heavily on keyword repetition or density.
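To illustrate why context on both sides of a keyword matters, here is a deliberately simplistic sketch (nothing like Google’s actual model, and the clue lists are invented): it guesses which sense of the word “bank” a sentence means by looking at the words both before and after it, the way a bidirectional model can.

```python
# Toy word-sense disambiguation. The sense names and clue words below
# are hypothetical examples, not part of any real search system.
SENSE_CLUES = {
    "river": {"fishing", "water", "muddy", "shore"},
    "finance": {"deposit", "loan", "account", "money"},
}

def disambiguate(sentence, keyword="bank"):
    """Pick the sense whose clue words best match the keyword's surroundings."""
    words = sentence.lower().split()
    idx = words.index(keyword)
    # Look at context on BOTH sides of the keyword, not just the left.
    context = set(words[:idx]) | set(words[idx + 1:])
    scores = {sense: len(clues & context) for sense, clues in SENSE_CLUES.items()}
    return max(scores, key=scores.get)

print(disambiguate("we went fishing on the muddy bank of the river"))  # river
print(disambiguate("she opened a bank account to deposit her money"))  # finance
```

A one-directional model reading left to right would reach “bank” in the second sentence before seeing “account”, “deposit” or “money”; reading in both directions is what lets those later words inform the meaning.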
If you’d like an immersive, scientific dive into BERT, check out this Search Engine Journal video interview with the highly respected Dawn Anderson.
Want to learn more about the inner workings of search engines? Check out a comprehensive resource from Google for “How Search algorithms work”.
Our lead consultants at New Media Advisors have specialized in award-winning SEO strategies for more than two decades. We thrive in a world of constant change and have the digital insights, experience and aptitude to guide your business into the future of Organic Search, helping you exceed your customers’ needs while boosting your SEO market share.