We all know that page rankings fluctuate in Google. New sites and content are created every day causing results to constantly change. But how does content rank to start with? The answer is simple: Google algorithms.
Google has rolled out several algorithms over the years. They sift through every webpage in the Search index to find the most relevant results for a user’s query, drawing on a range of ranking signals.
Different algorithms focus on different aspects of Search. Some take aim at black-hat link building, while others aim to improve local searches. It all feeds into Google’s ethos to deliver the most relevant and reliable information, and make it universally accessible.
When a new algorithm is rolled out, SEO specialists have to react quickly. Each one requires a new approach and strategy to ensure that websites don’t lose rankings.
Following on from Google’s emphasis on Core Web Vitals and page experience, the 2022 page experience update finally brought the 2021 signals to desktop. It primarily affects ranking signals tied to the Core Web Vitals metrics, alongside signals like the HTTPS connection that keeps your website secure and the absence of page elements that block the user’s view of the main content.
On 17th November 2021, Google announced the rollout of its latest core update. Taking between one and two weeks to roll out, the broad core updates have a big impact, with rankings changing as new content and old content are assessed together. Sites that are affected can look to improve their rankings by analysing the content present on the site and ensuring it is of the highest quality.
This update had a relatively gentle effect on the search results, leading many to think it was primarily an infrastructure update. However, the update does appear to have affected several sectors.
The second part of the broad core update was released on 1st July 2021, completing the rollout that began at the start of the previous month. It was a smaller update than June’s, although its changes were felt much more quickly, and it affected several sectors.
Google implemented a two-part spam update on the 23rd and 28th June 2021 as part of its regular work to improve results. While it’s not clear exactly what kind of spam tactics Google was targeting, spam updates are a reminder for SEOs to steer clear of black-hat tactics or risk losing rankings in updates like this.
The page experience update was initiated on 15th June and rolled out through to the end of August. It incorporated the Core Web Vitals, including cumulative layout shift (CLS), into ranking, putting user experience front and centre.
The June 2021 Core Update was the first of two parts; the rest was released in July 2021 because Google needed more time to complete it. This part took 10 days to roll out fully, and it hit sites with thin content hardest, along with some specific sectors.
In June 2021, Google began to roll out its page experience factors as a ranking signal, including its Core Web Vitals. These are a set of measurable vitals that are good indicators of page experience, consisting of largest contentful paint (LCP), first input delay (FID), and cumulative layout shift (CLS). To get the full lowdown, you can visit our blog.
However, it is worth noting that as the measures of page experience evolve over time so may the Core Web Vitals.
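To make the three metrics above more concrete: Google publishes “good” and “poor” boundaries for each Core Web Vital, assessed at the 75th percentile of page loads (LCP good at 2.5 s or under, FID good at 100 ms or under, CLS good at 0.1 or under). A minimal sketch of classifying field measurements against those published thresholds; the `rate_vital` helper and threshold table are our own illustration, not a Google API:

```python
# Published "good" / "needs improvement" upper bounds for each
# Core Web Vital; values above the second bound count as "poor".
CWV_THRESHOLDS = {
    "LCP": (2.5, 4.0),   # largest contentful paint, seconds
    "FID": (0.1, 0.3),   # first input delay, seconds
    "CLS": (0.1, 0.25),  # cumulative layout shift, unitless score
}

def rate_vital(metric: str, value: float) -> str:
    """Classify a single Core Web Vitals measurement."""
    good, needs_improvement = CWV_THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

# Example: a page with a 2.1 s LCP, an 80 ms FID and a 0.3 CLS.
page = {"LCP": 2.1, "FID": 0.08, "CLS": 0.3}
for metric, value in page.items():
    print(metric, rate_vital(metric, value))
```

In practice you would feed this from real-user data (for example, the Chrome User Experience Report) rather than single lab readings, since Google assesses the metrics across actual page loads.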
People appreciate product reviews that share in-depth research, as opposed to thin content that simply summarises a bunch of products. With this in mind, Google launched its product reviews update with the aim of rewarding high-quality review content. Separate from its regular core updates, the new algorithm’s overall focus was to surface content that provides insightful analysis and original research, written by experts or enthusiasts who know the topic well.
Google made a breakthrough in understanding the relevancy of specific passages: the needle-in-a-haystack information that searchers may be looking for. Its new passage understanding capabilities mean it can display better results for a search query, and Google estimates the change will improve 7% of search queries worldwide.
Not only can its systems highlight featured snippets more easily, but passage ranking also affects how it ranks web pages overall.
Impacting a staggering one in ten searches, this Google algorithm focused on natural language processing. Short for Bidirectional Encoder Representations from Transformers, BERT means that Google can work out the full context of a word by looking at what comes before and after it. This marked a big improvement in interpreting search queries and intents.
This update also affects some featured snippets as it now takes prepositions into account. Results are now more accurate and relevant to searchers’ intents.
This broad core algorithm saw shifts in ranking, although it mainly affected medical sites. No one knows what the exact purpose of this Google algorithm was, but it’s thought that it was either an attempt to improve understanding of user intent or protect users from discreditable information.
Google openly admitted that there were no quick fixes for sites that lost rankings as a result of Medic. SEOs were told to focus on “building great content” as Medic was now “benefiting pages that were previously under-rewarded”.
There’s no “fix” for pages that may perform less well other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages.— Google SearchLiaison (@searchliaison) March 12, 2018
While page speed was already a ranking factor for desktop searches, this update saw it become more important for mobile queries. This paved the way for mobile-first indexing.
Google said that it would only affect pages that “deliver the slowest experience to users”, and that sites with great quality could still rank high.
This update focused on local ranking. Local results began to depend more on the actual geographic location of the searcher and how the query was phrased, causing local SEO to become much more important.
Businesses located just outside towns and cities now found themselves ranking for searches in those areas, and the searcher’s physical location also became more important.
A large step towards better understanding user intent, RankBrain used machine learning to make guesses about the meaning of words and find synonyms to give relevant results. Bill Slawski described it like this:
“To an equestrian a horse is a large four-legged animal, to a carpenter, a horse has four legs, but it doesn’t live in fields or chew hay, to a gymnast a horse is something I believe you do vaults upon. With RankBrain context matters, and making sure you capture that context is possibly a key to optimising for this machine-learning approach.”
This meant that having individual pages optimised for keyword variants was well and truly dead. Good copy became optimised for keywords and synonyms, and Google began showing more results that were relevant for the whole keyword phrase.
By 2015, more than 50% of Google search queries were coming from mobile devices. It was time for an algorithm update to reflect this. Enter Mobilegeddon.
Mobile-friendly pages were now given a ranking boost on mobile searches to reflect the shift in consumer behaviour. The update was rolled out worldwide, but Google made a point of saying that the “intent of the search query is still a very strong signal”. Great content could still outrank mobile-friendliness.
Local search received some attention with Google’s Pigeon algorithm in 2014. Affecting both results pages and Google Maps, Pigeon improved distance and location ranking parameters, and it emphasised the need for good local SEO.
As part of the update, seven-pack local listings began to be downsized. At the same time, organic ranking signals started carrying more weight if local businesses wanted to be featured.
Hummingbird saw Google’s algorithms change direction. Affecting 90% of all searches, it brought semantic search to the fore to give searchers the answers they were looking for. More attention was given to each word in a query, meaning the whole phrase was taken into account. Hummingbird also laid the groundwork for voice search.
Optimising content for SEO changed as a result, and new best practices emerged.
While Google did release a Pirate version of its search interface as an Easter egg, Pirate is also the name of an update from 2012. A signal in the rankings algorithm, its purpose was to demote sites with large numbers of valid copyright removal notices.
Instead of showing sites with illegal content, Google prioritised streaming services like Netflix and Amazon. However, the former could still be found in the SERPs with a bit of digging.
Thought to be named after the Batman villain, Google’s Penguin algorithm saw keyword stuffers’ sites demoted. Quality was becoming the name of the game, and keyword stuffing and black-hat link building were now being punished.
While Penguin only considered a site’s incoming links, it still affected more than 3% of search results in its initial rollout. Four more versions followed over the next four years before it was finally added to the core algorithm in 2016. It had changed link building strategies forever.
A game-changer for local SEO, Venice aimed to “find results from a user’s city more reliably”. Local intent became much more important for relevant searches and was based on the location you’d set (because you could do that in 2012) or your IP address.
Small businesses were now able to rank for shorter keywords, and optimising for local SEO became more common. However, the update also gave rise to location stuffing: much like keyword stuffing, pages were filled with town and city names to make them rank higher.
If you remember the days of content farms and thin, low-quality content, you remember a time before this Google algorithm. The search engine decided to put an end to low-value, low-quality sites, permanently changing the world of SEO and rewarding original content.
Panda wasn’t a small update: it impacted 11.8% of search queries. The algorithm was built on 23 questions and human quality raters, and it handed negative ranking signals to sites with thin, duplicated, or otherwise low-quality content.
Did any of these algorithms impact your rankings? Let us know in the comments or tweet us @TeamTillison.