We all know that page rankings fluctuate in Google. New sites and content are created every day causing results to constantly change. But how does content rank to start with? The answer is simple: Google algorithms.
What are Google algorithms?
Google has rolled out several algorithms over the years. They sift through every webpage in the Search index to find the most relevant results for a user’s query, drawing on ranking signals like the meaning of the query, the relevance and quality of content, the usability of pages, and the searcher’s context.
Different algorithms focus on different aspects of Search. Some take aim at black-hat link building, while others aim to improve local searches. It all feeds into Google’s ethos to deliver the most relevant and reliable information, and make it universally accessible.
When a new algorithm is rolled out, SEO specialists have to react quickly. Each one requires a new approach and strategy to ensure that websites don’t lose rankings.
Google’s algorithm history
Core Web Vitals (2021)
In June 2021, Google began to roll out its page experience factors as a ranking signal. This includes the addition of its Core Web Vitals, a set of measurable metrics that are good indicators of page experience. They consist of largest contentful paint (LCP), first input delay (FID), and cumulative layout shift (CLS). To get the full lowdown, visit our blog.
However, it is worth noting that as the measures of page experience evolve over time so may the Core Web Vitals.
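Google publishes “good” and “poor” thresholds for each vital (LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1 count as good). As a rough sketch, not an official tool, here is how you might grade your own lab measurements against those published cut-offs:

```typescript
// Google's published Core Web Vitals thresholds ("good" / "poor" cut-offs).
type Rating = "good" | "needs improvement" | "poor";

const thresholds = {
  lcp: { good: 2500, poor: 4000 }, // milliseconds
  fid: { good: 100, poor: 300 },   // milliseconds
  cls: { good: 0.1, poor: 0.25 },  // unitless layout-shift score
} as const;

// Grade a single measured value against the thresholds for its metric.
function rate(metric: keyof typeof thresholds, value: number): Rating {
  const t = thresholds[metric];
  if (value <= t.good) return "good";
  if (value <= t.poor) return "needs improvement";
  return "poor";
}
```

For example, an LCP of 2,000 ms rates as “good”, while a CLS score of 0.3 rates as “poor”. In practice you would feed in values captured from real users or a lab tool rather than hard-coded numbers.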
Product reviews update (2021)
People appreciate product reviews that share in-depth research, as opposed to thin content that simply summarises a bunch of products. With this in mind, Google launched its product reviews update with the aim of rewarding high-quality review content. Separate from its regular core updates, the new algorithm’s overall focus was to surface content that provides insightful analysis and original research, written by experts or enthusiasts who know the topic well.
Passage Ranking (2021)
Google made a breakthrough in being able to better understand the relevancy of specific passages within a page. The search engine refers to these as the needle-in-a-haystack information that searchers may be looking for. Google’s new passage understanding capabilities mean that it can display better results for a search query, and it estimates that this will improve 7% of search queries worldwide.
Its systems are able to highlight featured snippets more easily, but the change also has an impact on how web pages are ranked overall.
BERT (2019)

Impacting a staggering one in ten searches, this Google algorithm focused on natural language processing. Short for bidirectional encoder representations from transformers, BERT means that Google is able to figure out the full context of a word by looking at what comes before and after it. This marked a big improvement in interpreting search queries and intents.
This update also affects some featured snippets as it now takes prepositions into account. Results are now more accurate and relevant to searchers’ intents.
Medic (2018)

This broad core algorithm saw shifts in rankings, and it mainly affected medical sites. No one knows exactly what this Google algorithm was for, but it’s thought to have been an attempt either to improve understanding of user intent or to protect users from unreliable information.
Google openly admitted that there were no quick fixes for sites that lost rankings as a result of Medic. SEOs were told to focus on “building great content” as Medic was now “benefiting pages that were previously under-rewarded”.
Mobile Speed Update (2018)
Rolled out in July 2018, this update made page speed a ranking factor for mobile searches. Google said that it would only affect pages that “deliver the slowest experience to users”, and that sites with great quality could still rank high.
Possum (2016)

This update focused on local ranking. Local results began to depend more on the searcher’s actual geographic location and how the query was phrased, making local SEO much more important.
Businesses located just outside towns and cities now found themselves included in rankings for those areas, and the physical location of the searcher became more important too.
RankBrain (2015)

A large step towards better understanding user intent, RankBrain used machine learning to make guesses about the meaning of words and find synonyms to give relevant results. Bill Slawski described it like this:
“To an equestrian a horse is a large four-legged animal, to a carpenter, a horse has four legs, but it doesn’t live in fields or chew hay, to a gymnast a horse is something I believe you do vaults upon. With RankBrain context matters, and making sure you capture that context is possibly a key to optimising for this machine-learning approach.”
This meant that having individual pages optimised for keyword variants was well and truly dead. Good copy became optimised for keywords and synonyms, and Google began showing more results that were relevant for the whole keyword phrase.
Mobile-friendly update (2015)

Mobile-friendly pages were now given a ranking boost on mobile searches to reflect the shift in consumer behaviour. The update was rolled out worldwide, but Google made a point of saying that the “intent of the search query is still a very strong signal”. Great content could still outrank mobile-friendliness.
Pigeon (2014)

Local search received some attention with Google’s Pigeon algorithm in 2014. Affecting both results pages and Google Maps, Pigeon improved distance and location ranking parameters, and it emphasised the need for good local SEO.
As part of the update, seven-pack local listings began to be downsized. At the same time, organic ranking signals started carrying more weight if local businesses wanted to be featured.
Hummingbird (2013)

Hummingbird saw Google’s algorithms change direction. Affecting 90% of all searches, it brought semantic search to the fore to give searchers the answers they were looking for. More attention was given to each word in a query, meaning the whole phrase was taken into account. Hummingbird also laid the groundwork for voice search.
Optimising content for SEO changed as a result, and best practices began to include:
- Diversifying content length
- Producing visual content
- Using topic-appropriate language
- Using schema markup language
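Schema markup is usually added as a JSON-LD block in the page’s HTML. As a minimal, hypothetical illustration (the headline, organisation name, and date below are placeholders, not real values):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Google’s algorithm history",
  "author": { "@type": "Organization", "name": "Example Agency" },
  "datePublished": "2021-06-01"
}
</script>
```

Structured data like this helps search engines understand what a page is about and can make it eligible for rich results.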
Pirate (2012)

While you can set Google Search’s interface language to Pirate, it’s also the name of an update from 2012. A signal in the rankings algorithm, its purpose was to demote sites with large numbers of valid copyright removal notices.
Instead of showing sites with illegal content, Google prioritised legitimate streaming services like Netflix and Amazon. However, the demoted sites could still be found in the SERPs with a bit of digging.
Penguin (2012)

Thought to be named after the Batman villain, Google’s Penguin algorithm saw keyword stuffers’ sites demoted. Quality was becoming the name of the game, and keyword stuffing and black-hat link building were now being punished.
While Penguin only considered a site’s incoming links, it still affected more than 3% of search results in its initial rollout. Four more versions followed over the next four years before it was finally added to the core algorithm in 2016. It had changed link building strategies forever.
Venice (2012)

A game changer for local SEO, Venice aimed to “find results from a user’s city more reliably”. Local intent became much more important for relevant searches, and was based on the location you’d set (because you could do that in 2012) or your IP address.
Small businesses were now able to rank for shorter keywords and optimising for local SEO became more common. However, it also gave rise to location stuffing. Similar to keyword stuffing, pages were filled with town and city names to make them rank higher.
Panda (2011)

If you remember the days of content farms and thin, low-quality content, you remember a time before this Google algorithm. The search engine decided to put an end to low-value, low-quality sites, permanently changing the world of SEO and rewarding original content. Panda demoted sites featuring:
- Thin, low-quality content
- Duplicate content
- A lack of authority and trustworthiness
- A high ad-to-content ratio
- Content that didn’t match the search query
Did any of these algorithms impact your rankings? Let us know in the comments or tweet us @TeamTillison.