
Bad SEO Mistakes For Small Businesses

The case of "Spammy Bill", bad seo mistakes and how all of his crappy websites lost their rankings as a result of Google's algorithm updates.

I received a call one day from a person looking for some SEO services.

He informed me that he had enjoyed excellent rankings for years, but had suddenly lost his rankings and needed some help.

As it turned out, he had been making some of the most common bad SEO mistakes from the late 90s, and the various Google algorithm updates had finally caught up with him.

He also knew that his biggest SEO mistakes were meant to game the system, but figured he was clever enough to get away with it.

Nice try, Spammy Bill.

As the saying goes, a little bit of knowledge can be a dangerous thing and people like him are no exception: They look for the easiest route without having learned the rules first.

And as you are probably guessing, bad SEO mistakes can be the only result of such an approach.

I’ll refer to him as “Spammy Bill”, in part because I don’t want to get sued, but also because this is about showcasing bad SEO mistakes for small business websites and not about intentionally embarrassing someone.

If you have a website, the following should serve as an absolute primer on what NOT to do with your SEO.

Bad SEO Mistakes For Small Businesses

Meta title tags.

Ordinarily, these are used to provide a brief introduction to what the page is about.

Sixty characters is enough, as is a keyword or three to support the content of the page.

Google also places emphasis on the first few words contained within this title tag, so using your company’s name at the start of the tag probably isn’t one of the most practical things to do when you’re trying to compete for a given search term.

“Spammy Bill”, however, had better ideas.

He used the opportunity to cram in as many keywords as possible and figured that if 60 characters is good, 100 must be better.

This is the first of his many bad SEO mistakes.

If “Spammy Bill” sold apples in London, here is what his title tag would look like:

<title>His Company Name. Apples London. Looking for apples in London? We in London have apples to sell. London apples for sale. Buy London apples.</title>

Seriously.

And with no exaggeration.
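
For contrast, a sensible title tag for that same page could look something like this (a hypothetical example of my own, with the keyword up front, the company name at the end and well under 60 characters):

<title>Fresh Apples in London | His Company Name</title>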

Meta keyword tags.

Of all his mistakes, these were my personal favourites.

For those who may not know, the meta keyword tag officially died in 2009.

It had been useless for a while before that, but that's when Google, Bing and Yahoo formally acknowledged it.

The kicker, of course, was that any search engine still involved in the meta keyword game would burn you for spamming if you went too far with it.

As a result, most people in the know just started to avoid them altogether.

But “Spammy Bill” has never met a best-practice he wouldn’t (egregiously) violate.

Never one to be swayed by official guidelines, he wrote keyword tags that were sometimes 400 words long and contained services he didn't even offer.

For good measure, he also included the company names, telephone numbers and the URLs of his competitors. 

How’s that for misleading?

And one of his home pages contained 150 words while his meta keyword tag held more than 400.

Now, I’m not a mathematician by any stretch, but I’m quite certain you cannot mention 400 things with just 150 words.

Using our “London apples” example, here’s what his keyword tag would have looked like:

<meta name="keywords" content="London, london, apple, apples, london apple, london apples, sell apple, sell apples, apple sale, apples sale, buy apple France apples, Spain apples, Spain pears, London pears, www.spain-pears.com, 123-456-7890...">

You get the idea.

They would go on forever and start mentioning other towns, companies, telephone numbers, misspellings, URLs and anything else he thought would trick the search engines into sending him traffic.

He even told me, in a matter-of-fact kind of way, that he was simply including words and terms that his competitors offered, even if he didn’t specifically offer those things himself.

And when I informed him that meta keyword tags were now obsolete, he assured me that this wasn’t the case and that they did in fact work.

So I showed him the video where Matt Cutts says Google doesn’t place any value on them.

“Spammy Bill” still disagreed.

<Sigh>

Meta description tags.

Meta descriptions are intended to support the meta titles.

I personally use 130 characters in my meta descriptions because Google will truncate (cut off) anything longer than that in its mobile search results.

I also think 130 characters is more than enough to convey a strong supporting message.

Technically speaking, Google will show a little more than 300 characters on desktop results, but most people aren’t going to fully read it and Google doesn’t use them in its ranking algorithms, anyway.

“Spammy Bill” looked at these limits as simple recommendations and used as many words as he could:

<meta name="description" content="Buy london apples in london because we sell london apples buy from {Company Name} with the biggest apples london seller...">

“Spammy Bill” figured he'd just use his meta keywords again, but with a splash of verbs and prepositions to switch it all up a bit, and most of his meta descriptions were well over 300 characters.

“Spammy Bill” clearly isn’t big on the concept of moderation.
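
By contrast, a description that stays within roughly 130 characters and actually supports the title might read something like this (another made-up example for our London apple seller):

<meta name="description" content="Fresh, locally grown apples for sale across London. Visit our market stall or order online for same-day delivery.">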

Keyword stuffing & keyword density.

Not just limited to your meta tags, “keyword stuffing” includes over-cooking your page content with too many keywords or key phrases, which leads to a high “keyword density”.

Back in the day, “keyword density” was all the rage with everyone ‘declaring’ the most advantageous ratio between content and keywords.

None of this was ever confirmed, and, as you probably have already guessed, it was badly misused.

This is another outdated SEO tactic that was used to fool the search engines into thinking your page had better content than it did.

The algorithms weren’t as sophisticated as they are these days, so they really only picked up on the keywords.

Whether the page made any actual or reasonable sense to viewers wasn’t the point and a lot of websites published some really, really bad content as a result.

Here’s an example:

[Image: Bad SEO Keyword Stuffing]

This is definitely a bit of overkill on my part, and you probably haven't seen much content that goes this far, but right away we can see what is going on.

[Image: Bad SEO Mistakes Keyword Density]

Now imagine every page on a website saying exactly the same thing with only the main keywords being changed.

[Image: Bad SEO Mistakes Too Many Keywords]
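
To spell the idea out with a made-up example (using our hypothetical London apple seller again), keyword-stuffed copy reads something like this:

<p>Looking for apples in London? Our London apples are the freshest apples in London. Buy London apples today from the best London apple seller in London!</p>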

Starts to get old, right?

It wasn’t hard for Google to demote such websites and the special violators – the ones that offered absolutely nothing valuable to anyone – were removed outright.

Even “Spammy Bill” offers actual services, after all.

Linking multiple websites with the same hosting company.

If you are going to register a bunch of domains and link websites together (which you shouldn’t), at least try not to have them all with the same hosting company.

Shared servers and IP addresses from one hosting company will be detected by the search engines, and instead of the value you were hoping for, you will probably get penalised.

Google can tell that your websites are being hosted in the same spot and won’t be fooled.

“Spammy Bill” has dozens of websites like this.

Link pages.

Another outdated SEO technique, “link pages” provide zero SEO benefits these days but will almost certainly affect your rankings negatively if you try using them.

“Link pages” and “Link farms” are huge SEO mistakes.

Back in the day, Google decided to judge your website in part by how many people were linking to you.

It had become a sort of “Social Proof” concept, and any link, no matter how relevant, was eagerly sought.

But, and as with all SEO fads, the spammers eventually ruined it for everyone.

The links people were getting were from low-quality websites with crappy content, and the metric lost any chance of being meaningful.

It also hasn’t helped anyone’s SEO in years.

But “Spammy Bill” remembers the good old days fondly and continued to link all of his websites together, regardless of whether they were related in any way.

For good measure, he also stuffed the anchor text of those links with spammy keywords.

Linking a bunch of unrelated and crappy websites together is another one of those unnecessary mistakes people make in the world of SEO.

Duplicate content.

Why publishing duplicate content throughout your website is a bad thing seems obvious enough, right?

Saying the same thing across different pages will tell Google that you really have nothing of interest to offer viewers.

This is true for many reasons, but for this exercise we’ll just go with metrics such as “Bounce Rate” and “Time On Page”.

If people spend a few seconds on a landing page and then hit the back button, Google will eventually decide that your page sucks.

There are cases where this won’t be so, but, generally speaking, it’s so.

Google is way past giving you credit for simply having a page online and publishing spammy, duplicate content just adds to the list of bad SEO mistakes.

But “Spammy Bill” doesn’t let silly things like metrics get in his way.

He says exactly the same thing on his pages and only changes the name of the city.

The problem with saying virtually the same thing across your website is that it could have all been said on one page.

If you provide dry cleaning services in 10 cities, having 10 different pages that say the same thing won’t help you much.
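
A simpler approach is one well-written page that covers all of your service areas, along the lines of this purely hypothetical snippet (the cities are just examples):

<p>We provide dry cleaning pickup and delivery in Toronto, Scarborough, Markham, North York and the surrounding areas.</p>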

Multiple business addresses.

Bad SEO mistakes aside, it should never be said that “Spammy Bill” isn’t a smart guy.

He understands that ranking locally means you should probably be local yourself, so he decided to keep using an address in Toronto even after he moved outside the city.

I had to ask him why he thought having two different business addresses (representing one actual business location) would be a good idea.

His response?

“I want to rank in a bunch of cities”.

Fair enough.

But not very practical.

Unless you’re in a franchise or multiple-locations kind of business, having different business information across the web will really hurt your local search potential.

The business information on your website, social media accounts and business directories needs to be consistent.

As in “exactly the same”.
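
As a purely hypothetical illustration (the business details below are invented), that means showing the exact same name, address and phone number everywhere your business appears:

<p>Example Dry Cleaners<br>123 Main Street, Toronto, ON M1B 2C3<br>(416) 555-0123</p>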

“Local Search” is designed to put people in touch with businesses that are close to them.

As a result, you are almost always going to rank better in the city that hosts your actual business address.

You can still rank for other areas, but other businesses that have addresses in those areas will probably rank higher for those searches than you will.

All things being relatively equal, that’s how it works.

And a customer in Scarborough probably isn’t going to appreciate that you lied about your business location.

Spammy anchor text.

“Anchor text” is the clickable text within a link, like “Don’t Click Here”.

In this case, the anchor text would be “Don’t Click Here”.
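
In plain HTML, that link would look something like this (the URL is just a placeholder):

<a href="https://example.com/">Don't Click Here</a>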

Using words that describe the content being linked to is considered good, but not so much when every link to that particular content says the same thing.

And definitely not when they’re cooked with too many keywords.

Why?

Because it’s unnatural.

Think about it for a second.

If you have 20 external links coming to your website that say exactly the same thing, what are the odds that 20 different people would choose the same words to link to your website?

It could happen, but probably not.

But this type of thing did work well years ago, which is why people like “Spammy Bill” began using it.

And his problem now is that most of those links are not only worthless, they have exposed his spammy websites for what they really are: manipulative attempts to game the system.

Website & link directories.

Another useless SEO tactic and one that some SEO companies use to this day.

Back in the day, any backlink to your website was worth having and website directories were easy enough to use: you simply added your URL and some information and, voila, you’d have a backlink.

Magical software began appearing online (at a cost) that automated the process, and it was possible to create thousands of backlinks in a very short period of time.

“Spammy Bill” was not about to miss his chance to jump on this trend.

Eventually, and as expected, people began to abuse it and virtually all such links were deemed worthless.

Being listed in those directories probably won’t hurt your rankings, but having spammy anchor text linking back to you probably will.

More importantly, they really don’t help and are largely a waste of time (and your money if you are paying an SEO company to do it).

Google already knows that those directories are of the low-quality variety, so the only real way to go is down in terms of any organic search benefits.

Google has also removed a bunch of spammy directory websites from its search index, and most of those removals were probably related to the Panda and Penguin algorithm updates (more on those below).

So, how crappy does your website have to be to be completely removed from Google’s search results altogether?

Pretty crappy.

Some estimates put the number of removed directories as high as 50%, although the real figure was probably somewhere between 20% and 50%.

Regardless of how many actually went down, the point should be well taken that having links from any of those turned out to be worthless and that relying on website directories for backlinks is generally a waste of time.

“Spammy Bill” lost a lot of his time and ‘SEO-Cred’ on that one.

The Google algorithm updates.

For those not in the know, I should probably give a brief explanation about each of the major updates and how they affect(ed) websites.

I’ll also link to the Wikipedia articles about each one if you would like to read more.

They are definitely worth a read because they were complete SEO game-changers and showed everyone just how vulnerable their rankings were to an algorithm change.

Google Algorithm Updates

Google Panda (2011)

The “Panda” update focused on websites with low quality content.

Websites like “Spammy Bill’s” were prime targets and became vulnerable to losing some of their higher rankings.

Ever hear the saying that “Content is King”?

This is where that started.

Read more about the Panda update on Wikipedia.

Google Penguin (2012)

The “Penguin” update went after spammy websites with poor content, stuffed keywords and irrelevant links.

Sound familiar? It should by now.

Virtually everything “Spammy Bill” had done online fell under this one.

A lot of people were affected, and rightly so.

To use an analogy, it's a bit like being on steroids: it's all good until you get caught.

Then it’s not so good.

Read more about the Penguin update on Wikipedia.

Google Hummingbird (2013)

Based on speed and precision, this update focused on ‘semantic’ search and helped websites with well-written content.

It’s important to note that crappy websites with useless content didn’t get a boost, which is almost as bad as being penalised.

Read more about the Hummingbird update on Wikipedia.

See ALL of the Google algorithm updates here.

So, if all of those things are so bad, why do some spammy websites still get good search engine rankings?

That’s a great question.

On the one hand, Google claims to be ridding the web of useless, spammy websites.

On the other hand, sites like “Spammy Bill’s” continue to appear in the search results.

Seems like an obvious contradiction, right?

Here's the answer:

Some spammy websites will continue to get some rankings, but anyone with a more relevant website will do better than them.

Make sense?

Simply stated, I could easily create a small website that would do better than any of “Spammy Bill’s” websites, regardless of topic.

Why?

Because none of his sites say anything special, or have any authoritative backlinks, or have any weight with Google whatsoever.

Simply being indexed and coming up for some vague search terms and keywords means nothing.

Out-ranking websites like that is pretty simple and anyone who puts together a semi-entertaining and informative website on the same topic is going to do as well or better than him.

If it’s formatted properly and adheres to “Best Search Practices”, it also won’t be subject to any penalties.

He’s also easy to beat because he doesn’t rank for any competitive search terms.

He’ll come up on the first page for certain things, but, and as I’ve mentioned above, anyone can do that and anyone who does will pass him.

“Spammy Bill” should know all of this by now because he called me for this very reason.

The trouble is that he believes he just has to do more of the same.

UPDATE:

It’s been a few years since I first met “Spammy Bill” and I have periodically checked in on his websites.

As expected, his rankings have consistently dropped and he basically languishes at the bottom of the first page of Google’s organic results – if he’s lucky – and has next to zero presence in the local Map results.
