
Every Google Algorithm Update You Need to Know About


We all know that page rankings fluctuate in Google. New sites and content are created every day, causing results to change constantly. But how does content rank in the first place? The answer is simple: Google algorithms.

What are Google algorithms?

Google has rolled out several algorithms over the years. They look through every webpage in the Search index to find the most relevant results for a user’s query, weighing ranking signals like:

  • The relevance and quality of a page’s content
  • The usability of the page
  • The searcher’s context, such as location and settings

Different algorithms focus on different aspects of Search. Some take aim at black-hat link building, while others aim to improve local searches. It all feeds into Google’s ethos to deliver the most relevant and reliable information, and make it universally accessible.
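Google’s real systems and weightings are proprietary, so the following is nothing more than a conceptual sketch: a toy scoring function in which invented signals and weights combine into a single score that pages are sorted by.

```python
# Conceptual sketch only: Google's real signals and weights are not
# public. Every name and number here is invented for illustration.

SIGNAL_WEIGHTS = {
    "relevance": 0.4,
    "content_quality": 0.3,
    "usability": 0.2,
    "freshness": 0.1,
}

def rank_score(signals: dict) -> float:
    """Weighted sum of per-page signal values (each assumed 0.0-1.0)."""
    return sum(weight * signals.get(name, 0.0)
               for name, weight in SIGNAL_WEIGHTS.items())

pages = {
    "example.com/page-a": {"relevance": 0.9, "content_quality": 0.7, "usability": 0.8},
    "example.com/page-b": {"relevance": 0.6, "content_quality": 0.9, "usability": 0.9},
}

# Order pages by descending score, as a results page conceptually would.
for url in sorted(pages, key=lambda u: rank_score(pages[u]), reverse=True):
    print(url, round(rank_score(pages[url]), 3))
```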

When a new algorithm is rolled out, SEO specialists have to react quickly. Each one requires a new approach and strategy to ensure that websites don’t lose rankings.

Google’s algorithm history

BERT (2019)

Impacting a staggering one in ten searches, this Google algorithm focused on natural language processing. Short for Bidirectional Encoder Representations from Transformers, BERT means that Google can figure out the full context of a word by looking at what comes before and after it. This marked a big improvement in interpreting search queries and intents.

The update also affected some featured snippets, as BERT takes prepositions into account, making results more accurate and relevant to searchers’ intents.
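Google’s production models aren’t public, but the open-source bert-base-uncased checkpoint (an assumption here: it’s pulled from Hugging Face’s transformers library, not Google Search itself) shows the bidirectional idea in action. The same word gets a different vector depending on what surrounds it:

```python
# Sketch: the same word gets different contextual embeddings with BERT.
# Uses the open-source bert-base-uncased checkpoint, not Google's
# production Search models. Requires the transformers and torch packages.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (inputs["input_ids"][0] == word_id).nonzero()[0].item()
    return hidden[position]

a = embedding_of("she sat on the bank of the river", "bank")
b = embedding_of("he paid the cheque into the bank", "bank")

# Identical surface form, noticeably different vectors in context.
print(torch.cosine_similarity(a, b, dim=0).item())
```

A similarity score well below 1.0 shows that the model reads “bank” differently in each sentence, which is exactly the context-awareness the update brought to query interpretation.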

Medic (2018)

This broad core algorithm saw shifts in rankings, and it mainly affected medical sites. Google never confirmed its exact purpose, but it’s thought to have been either an attempt to improve understanding of user intent or to protect users from unreliable information.

Google openly admitted that there were no quick fixes for sites that lost rankings as a result of Medic. SEOs were told to focus on “building great content” as Medic was now “benefiting pages that were previously under-rewarded”.

Mobile Speed Update (2018)

While page speed was already a ranking factor for desktop searches, this update saw it become more important for mobile queries. This paved the way for mobile-first indexing.

Google said that it would only affect pages that “deliver the slowest experience to users”, and that sites with great quality could still rank high.
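Page speed is easy to audit yourself. Below is a minimal sketch against the public PageSpeed Insights v5 API, the documented way to pull Lighthouse scores; the URL is a placeholder, and an API key (omitted here) is recommended for anything beyond occasional checks.

```python
# Sketch: fetch a mobile performance score from the public
# PageSpeed Insights v5 API. An API key is only needed for
# higher request volumes.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def mobile_performance_score(url: str) -> float:
    response = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "strategy": "mobile"},
        timeout=60,
    )
    response.raise_for_status()
    data = response.json()
    # Lighthouse reports performance as a 0.0-1.0 score.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(mobile_performance_score("https://example.com"))
```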

Possum (2016)

This update focused on local ranking. Local results began to depend more on the actual geographic location of the searcher and how the query was phrased, causing local SEO to become much more important.

Businesses located just outside towns and cities now saw themselves included in local rankings for those areas.

RankBrain (2015)

A large step towards better understanding user intent, RankBrain used machine learning to make guesses about the meaning of words and find synonyms to give relevant results. Bill Slawski described it like this:

“To an equestrian a horse is a large four-legged animal, to a carpenter, a horse has four legs, but it doesn’t live in fields or chew hay, to a gymnast a horse is something I believe you do vaults upon. With RankBrain context matters, and making sure you capture that context is possibly a key to optimising for this machine-learning approach.”

This meant that having individual pages optimised for keyword variants was well and truly dead. Good copy became optimised for keywords and synonyms, and Google began showing more results that were relevant for the whole keyword phrase.
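RankBrain’s internals were never published, but the word-vector idea behind this kind of matching is easy to demonstrate: words used in similar contexts get similar vectors, so synonyms score close together. The vectors below are hand-made toys, not learned from real data.

```python
# Toy sketch of the word-vector idea behind RankBrain-style matching:
# synonyms sit close together in vector space. These vectors are
# hand-made for illustration, not learned from a real corpus.
import numpy as np

vectors = {
    "cheap":      np.array([0.9, 0.1, 0.0]),
    "affordable": np.array([0.8, 0.2, 0.1]),
    "luxury":     np.array([0.1, 0.9, 0.3]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: 1.0 means identical direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["cheap"], vectors["affordable"]))  # high: near-synonyms
print(cosine(vectors["cheap"], vectors["luxury"]))      # low: unrelated
```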

Mobilegeddon (2015)

By 2015, more than 50% of Google search queries were coming from mobile devices. It was time for an algorithm update to reflect this. Enter Mobilegeddon.

[Image: Google’s Mobilegeddon announcement. Source: Google Webmaster Central Blog]

Mobile-friendly pages were now given a ranking boost on mobile searches to reflect the shift in consumer behaviour. The update was rolled out worldwide, but Google made a point of saying that the “intent of the search query is still a very strong signal”. Great content could still outrank mobile-friendliness.
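Google’s mobile-friendliness criteria covered far more than any single tag, but the most basic signal, a responsive viewport declaration, can be checked in a few lines. A crude heuristic sketch, not Google’s actual test:

```python
# Crude heuristic: does a page declare a responsive viewport?
# This is one small mobile-friendliness signal, not Google's full test.
import requests

def has_responsive_viewport(url: str) -> bool:
    html = requests.get(url, timeout=30).text.lower()
    return 'name="viewport"' in html and "width=device-width" in html

print(has_responsive_viewport("https://example.com"))
```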

Pigeon (2014)

Local search received some attention with Google’s Pigeon algorithm in 2014. Affecting both results pages and Google Maps, Pigeon improved distance and location ranking parameters, and it emphasised the need for good local SEO.

As part of the update, seven-pack local listings began to be downsized. At the same time, organic ranking signals started carrying more weight if local businesses wanted to be featured.
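The distance parameter Pigeon refined is straightforward to compute. A standard haversine sketch for the great-circle distance between a searcher and a business; the coordinates are made up:

```python
# Haversine great-circle distance: the kind of searcher-to-business
# distance a local ranking system has to weigh. Coordinates are made up.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))  # 6371 km: mean Earth radius

# A searcher in central Birmingham vs a business a few miles out.
print(round(haversine_km(52.4862, -1.8904, 52.4500, -1.7300), 1), "km")
```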

Hummingbird (2013)

Hummingbird saw Google’s algorithms change direction. Affecting 90% of all searches, it brought semantic search to the fore to give searchers the answers they were looking for. More attention was given to each word in a query, meaning the whole phrase was taken into account. Hummingbird also laid the groundwork for voice search.

Optimising content for SEO changed as a result, and best practices began to include:

  • Diversifying content length
  • Producing visual content
  • Using topic-appropriate language
  • Using schema markup (see the sketch after this list)
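Schema markup is typically embedded as JSON-LD in a script tag of type application/ld+json. A minimal sketch that generates an Article snippet with Python’s standard json module; every field value below is a placeholder:

```python
# Minimal sketch: generate a JSON-LD Article snippet for schema markup.
# The field values are placeholders; schema.org defines the vocabulary.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Every Google Algorithm Update You Need to Know About",
    "author": {"@type": "Organization", "name": "Example Agency"},
    "datePublished": "2020-01-01",
}

# Embed the output in the page's HTML inside
# <script type="application/ld+json"> ... </script>.
print(json.dumps(article, indent=2))
```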

Pirate (2012)

While Pirate is one of the novelty languages you can set Google Search to, it’s also the name of an update from 2012. A signal in the rankings algorithm, its purpose was to demote sites with large numbers of valid copyright removal notices.

Instead of showing sites hosting illegal content, Google prioritised legitimate streaming services like Netflix and Amazon, although the infringing sites could still be found in the SERPs with a bit of digging.

Penguin (2012)

Thought to be named after the Batman villain, Google’s Penguin algorithm saw keyword stuffers’ sites demoted. Quality was becoming the name of the game, and keyword stuffing and black-hat link building were now being punished.

While Penguin only considered a site’s incoming links, it still affected more than 3% of search results in its initial rollout. Four more versions followed over the next four years before it was finally added to the core algorithm in 2016. It had changed link building strategies forever.
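Penguin’s exact rules were never made public, but one pattern it’s widely believed to have punished, unnaturally repetitive exact-match anchor text, is easy to surface from a backlink export. A heuristic sketch; the 40% threshold is an arbitrary illustration, not a known Google figure:

```python
# Heuristic sketch: flag over-optimised anchor text in a backlink list.
# Penguin's real rules were never published; the 40% threshold is an
# arbitrary illustration and the anchor list is made up.
from collections import Counter

anchors = [
    "cheap widgets", "cheap widgets", "cheap widgets", "cheap widgets",
    "Example Ltd", "https://example.com", "click here", "cheap widgets",
]

counts = Counter(anchors)
total = len(anchors)
for text, n in counts.most_common():
    share = n / total
    flag = "  <- suspiciously repetitive" if share > 0.4 else ""
    print(f"{text!r}: {share:.0%}{flag}")
```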

Venice (2012)

A game changer for local SEO, Venice aimed to “find results from a user’s city more reliably”. Local intent became much more important for relevant searches, and was based on the location you’d set (because you could do that in 2012) or your IP address.

Small businesses were now able to rank for shorter keywords, and optimising for local SEO became more common. However, the update also gave rise to location stuffing: in a practice similar to keyword stuffing, pages were filled with town and city names to make them rank higher.

Panda (2011)

If you remember the days of content farms and thin, low-quality content, you remember a time before this Google algorithm. The search engine decided to put an end to low-value, low-quality sites, permanently changing the world of SEO and rewarding original content.

Panda wasn’t a small update – it impacted 11.8% of search queries. The algorithm was built on 23 questions and human quality raters, and it sent negative ranking signals to sites with the following (a rough audit sketch follows the list):

  • Thin, low-quality content
  • Duplicate content
  • A lack of authority and trustworthiness
  • A high ad-to-content ratio
  • Content that didn’t match the search query
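None of Panda’s thresholds were published, but the first two signals on that list can be roughed out against your own pages. A toy audit sketch; the 300-word cut-off and the page content are arbitrary:

```python
# Toy content audit for two Panda-style signals: thin pages and
# duplicate pages. The 300-word cut-off is arbitrary, not Google's.
import hashlib

pages = {
    "/about": "We are a company. " * 5,                   # thin
    "/post-1": "Original, in-depth article text. " * 80,
    "/post-2": "Original, in-depth article text. " * 80,  # duplicate of /post-1
}

seen_hashes: dict = {}
for path, text in pages.items():
    words = len(text.split())
    digest = hashlib.sha256(text.encode()).hexdigest()
    if words < 300:
        print(f"{path}: thin content ({words} words)")
    if digest in seen_hashes:
        print(f"{path}: duplicate of {seen_hashes[digest]}")
    seen_hashes.setdefault(digest, path)
```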

Did any of these algorithms impact your rankings? Let us know in the comments or tweet us @TeamTillison.
