
Understanding Google’s Algorithm: Why Do Spam Pages Sometimes Rank?


Recently, we were contacted on social media by a local business that was confused by an SEO question:


“Why don’t my pages rank higher than pages on a site with heaps of duplicate content and very low user value?”

We’ve all seen these sites: a ‘service + location’ page for every place they can think of. Every page is essentially identical, with only the location changed. The offending sites typically have hundreds of pages with very little difference between them and very little depth of content.

What about Google’s Algorithm?

Google’s Algorithm should reward unique, deep content and penalise sites with thin or duplicate content.

Here’s an example, using Estate Agents:

  • The site has a page targeting ‘estate agent portsmouth’.
  • ‘estate agent cosham’ gets the same page, with ‘Portsmouth’ replaced by ‘Cosham’.
  • And ‘estate agent hilsea’ gets the same page again, with ‘Portsmouth’ replaced by ‘Hilsea’.

Now, repeat that 1,000 times for every village, hamlet, small town or area within a city, with even a postcode or two thrown in for good measure.

Every page has a unique URL, a unique H1 tag and a unique title tag; the keyword is inserted into the meta description and even the image alt tags. However, the body content is almost identical on every page, with only the location swapped out. That’s a lot of copying and pasting.

You may not know the area, but Cosham and Hilsea are very small places with a tiny amount of search traffic, and for many of the locations targeted there may be no traffic at all. But that doesn’t matter – the theory is that if you create enough pages, each attracting a trickle of traffic, it all adds up to a lot of traffic.

But it’s junk, right? And it’s really frustrating if you are an estate agent in Cosham or Hilsea. You’re being outranked by junk that steals traffic from your site and costs you leads.

So, why does it rank when it shouldn’t?


There are two signals at work here: relevance versus authority. Because these sites have hundreds and hundreds of (crappy) pages and have probably spent a little time building backlinks, their domain authority is probably stronger than that of a small estate agent site with a few dozen pages and a handful of backlinks. On authority, the low-quality site wins.

The (crappy) page has been optimised pretty well for the search term in question. Often, the local agent won’t have pages which are well optimised, or they’ll be optimised for ‘estate agent portsmouth’ rather than the specific local terms. The (crappy) page has a stronger relevance signal. 

Simply put, the (crappy) page, whilst it is poor-quality, duplicated content, is a less poor result than the other pages that Google has crawled. There isn’t a great page to rank, so Google ranks the least irrelevant page it can find.

Here’s the good news:  


That strategy might work for locations and search terms with very low search volume and low competition. But it doesn’t work for much more than that. ‘Cosham’ and ‘Hilsea’ search terms might rank, but ‘Portsmouth’ search terms won’t.

Those pages are easy to outrank with a little effort:

1. You’ll need a specific page for each location, optimised for the search term – URL, title, meta description, H1 tag and so forth (there’s a rough sketch of these elements after this list).
2. Create UNIQUE content on that page relating to that specific location, including testimonials from customers in that location and properties for sale or to let there.
3. If necessary, create additional content such as blog posts, review pages, pages for sold properties, and pages about the local area: its schools, amenities, local employers and transport links.
4. Repeat for each location you cover.
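As a rough illustration, here is a minimal sketch of the on-page elements for a single location page. The domain, business name and copy below are invented; the point is simply that the URL, title, meta description, H1 and body are all written for that one location, rather than lifted from a template with the place name swapped in.

<!-- Illustrative sketch only: the domain, business name and copy are made up -->
<!-- The page lives at a location-specific URL, e.g. https://www.example.co.uk/estate-agents-cosham/ -->
<!DOCTYPE html>
<html lang="en-GB">
<head>
  <!-- Title and meta description target the specific local term -->
  <title>Estate Agents in Cosham | Example Estate Agents</title>
  <meta name="description"
        content="Looking for an estate agent in Cosham? See properties for sale and to let in Cosham, plus reviews from local sellers and landlords.">
</head>
<body>
  <!-- One clear H1 for the local term -->
  <h1>Estate Agents in Cosham</h1>

  <!-- Unique body copy written about Cosham itself, not pasted from another location page -->
  <p>... copy about the Cosham market, the streets you cover, schools and transport links ...</p>

  <h2>Properties for sale and to let in Cosham</h2>
  <!-- listings filtered to this location only -->

  <h2>What our Cosham customers say</h2>
  <!-- testimonials from customers in this location -->
</body>
</html>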

The (small) depth of content around your optimised page should enable your site to outrank those (crappy) results and win back the traffic.
