There are numerous blogs and articles on the web that will tell you how to do SEO successfully (in fact there are many on our blog). Equally, however, there are a great many posts that still insist on teaching old school or even black hat practices, that will ultimately get your site into trouble.
In this post, we are going to take a serious look at some of the practices you definitely want to avoid if you want to get into, or stay in, Google's good books.
What is Black Hat SEO?
When it comes to catching websites that try to game the system and achieve rankings by less than scrupulous methods, Google is a lot smarter than you might think. Black Hat SEO refers to tactics that do not play by the rule book, i.e. the Google Webmaster Guidelines. If you are caught violating those guidelines (and sooner or later you will be), the penalties can be severe: your site may be flagged as low quality and pushed down the rankings, or even de-indexed and removed from Google altogether.
Search engines have since cracked down on many practices they consider bad SEO, some of which were once regarded as acceptable. Attempts to manipulate the algorithm for ranking gain prompted two major updates designed to punish those who used these methods: Panda and Penguin, two seemingly innocent creatures that struck fear into the hearts of many webmasters. Since those heady days there have been further updates tackling other abuses, yet people still preach and practise these tactics.
What Should You Avoid?
Paid Links
Link building has long been considered one of the cornerstones of SEO, and there is a right way to do it. There is also a wrong way, and perhaps the worst of all is paid links.
Often touted as one of the easiest ways to rank a website quickly in the SERPs, buying links is far more likely to land your website with a penalty and see all your hard work disappear into the abyss.
Paid links go against the grain of natural SEO, which is designed to influence organic search results on merit. There are still companies selling backlinks and claiming they will quickly push your site to the top spot in Google. However, the algorithm at the heart of Google's search engine now constantly examines the do-follow and no-follow links that make up your site's link profile, and it is getting ever better at filtering out anything it deems spammy, low quality or purchased. Ultimately, it will kill your rankings.
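If you do carry paid or sponsored links on your own site (advertising, for instance), Google asks that they be marked up so they pass no ranking credit. A minimal illustration, with a hypothetical advertiser URL:

```html
<!-- A paid/sponsored link marked so it passes no ranking credit -->
<a href="https://advertiser.example.com" rel="sponsored nofollow">Our sponsor</a>
```

Declaring the relationship honestly keeps a commercial link from looking like a purchased ranking signal.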
Backlinks earned naturally through hard work, by giving your users better access to detailed and relevant information, or through building partnerships with like-minded businesses are all good SEO practice, so stick with these rather than trying to buy your way to the top.
Cloaking or Misdirection
Another old school tactic is what is commonly called cloaking, and if it is discovered the search engines generally take swift action against it. Cloaking is a method of serving different content depending on who is requesting the page. It is generally done to hide the true content of a website from the search engines, and it invariably ends with the searcher landing on a spoof or spammy website full of sales links and, quite often, dangerous content such as viruses.
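To make the trick concrete, here is a hedged sketch of what cloaking looks like server-side. The function name and page strings are purely illustrative; this is the pattern to avoid, not a recipe:

```python
# Illustration only: this is the black-hat pattern that search engines penalise.
def render_page(user_agent: str) -> str:
    """Serve one page to crawlers and a different one to real users."""
    if "Googlebot" in user_agent:
        # Clean, keyword-rich content shown only to the crawler.
        return "<html>Helpful, keyword-rich content for crawlers</html>"
    # The spammy sales page shown to human visitors.
    return "<html>Spammy sales links for human visitors</html>"
```

Because the crawler and the visitor see different documents, the ranking no longer reflects what users actually get, which is exactly why detection leads to swift penalties.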
Hidden Text and Keyword Stuffing
Hidden text was common practice before the various search engines' webmaster guidelines outlawed it, and it was used to spam pages with hidden keywords. This was generally done by writing words and phrases in the same colour text as the page background, rendering them invisible to the user but still crawlable by the search engines.
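As a concrete (and cautionary) illustration of that technique, with made-up placeholder keywords:

```html
<!-- Black-hat hidden text: white-on-white keywords. Do NOT do this. -->
<p style="color:#ffffff; background-color:#ffffff;">
  cheap widgets best widgets buy widgets online
</p>
```

Crawlers read the markup, not the rendered colours, which is why this fooled early engines and why modern ones specifically check for it.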
Talking of hiding keywords on a page, another tactic that was once common but will get you in trouble today is keyword stuffing. This involves cramming your content with keywords in the hope of tricking search engines into thinking the site delivers highly relevant content with high intent. In reality it provides little value to the user and makes the content appear disingenuous.
In my experience, more is not always better, and stuffing your content full of keywords for the sake of getting them in there will usually deliver far less than the result you want.
In a world where search engines use technologies such as latent semantic indexing and view a page's content as a whole, they now look for words and phrases related to the search query in order to establish that the content is genuinely relevant.
Of course, keyword research and targeting is still an incredibly important aspect of SEO, but the content must now be relevant and of use to the reader.
Link and Article Directories/Farms
These are probably not as common as they used to be, as many were shut down by previous Google algorithm updates, but they are still out there, and we still see websites that come to us after being penalised for using them.
When they first came about, they were quite a valuable resource, as they allowed searchers to discover content of interest. Article directories were a good way of sharing valuable information with a larger audience, and because the sites were generally categorised, it was easy to find relevant information.
But like many good things from the early days of SEO, they were soon abused by unscrupulous users and rapidly became libraries for hundreds of valueless, spun articles, until they were penalised by Google's algorithm, in many cases dragging businesses' websites into oblivion along with them.
Avoid them like the plague (or even Coronavirus). Instead, produce good content by writing articles of value, and publish content that users want to read and others want to link to.
Duplicate and Copied Content
Everyone probably knows by now that content is king; it has been that way for as long as I have been doing SEO. But content is only king if it is unique. Duplicate content is a definite don't, and copying or plagiarising content is even worse.
We have said throughout this article that Google and the other search engines are getting smarter, almost to the point of artificial intelligence. They can recognise duplicate content on your own website, and they can even identify the original author of a piece of work and assign the relevance there rather than to the copy on your site.
If you find an interesting article and rewrite it, adding to and improving it so that you provide fresh and valuable insight, that is a legitimate way to do it.
Copying an article word for word and cutting and pasting entire paragraphs is legally and ethically wrong and modern search engines can easily detect plagiarism.
Likewise, duplicating content on your own website is bad practice and a poor tactic for SEO. This often happens on sites such as eCommerce stores, where the same product appears in multiple categories. This sort of duplication is easily remedied with the tools provided, such as canonicalisation: essentially, telling the search engines which of the pages you deem to be the most important.
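For example, if the same product page is reachable under two category URLs, a canonical tag in the head of each variant points the search engines at the preferred version (the URLs below are hypothetical):

```html
<!-- On both /garden/widget and /sale/widget, declare the preferred URL -->
<link rel="canonical" href="https://www.example.com/products/widget" />
```

The duplicates stay accessible to shoppers, but ranking signals consolidate onto the single canonical address.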
Broken Links
OK, this is not the most critical of the bad things to leave on your site, but it and the final item below are still important. Broken links are links, either on your own site or earned in the past from other sites pointing at yours, that are no longer live or valid, and they usually end with the user hitting a 404 page. (Finding broken links on other sites is also a good SEO strategy, but that's one for another day.)
There is a lot of debate in the industry about how damaging 404 errors are as an actual ranking factor for Google; however, they are certainly an issue for the usability of the website and the user experience.
They often occur when links have been added as absolute URLs and the site structure later changes or pages move, or when pages become obsolete and are removed while links to them remain active elsewhere on the site.
Crawl your site regularly for broken links (there are many good free tools that can assist with this), then fix or redirect them and improve the user's journey through your site.
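If you prefer to script a basic check yourself, a minimal sketch in Python using only the standard library might look like this; the function names and the five-second timeout are our own choices, not taken from any particular tool:

```python
from html.parser import HTMLParser
import urllib.error
import urllib.request


class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list:
    """Return all anchor hrefs found in an HTML document."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def check_link(url: str, timeout: float = 5.0):
    """Return the HTTP status code for url, or None if unreachable.

    A 404 here means the link is broken and should be fixed or redirected.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code
    except urllib.error.URLError:
        return None
```

Feed each of your pages through `extract_links`, run `check_link` on every URL found, and anything that returns 404 goes on your fix-or-redirect list.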
Sitemaps
We could go on, but we will finish this article with this one. A site map is "a model of a website's content designed to help both users and search engines navigate the site. A site map can be a hierarchical list of pages (with links) organized by topic, an organization chart, or an XML document that provides instructions to search engine crawl bots" (Techopedia: https://www.techopedia.com/definition/5393/site-map).
Sitemaps allow search engines and crawlers to crawl a website more efficiently, so it is good practice to ensure that your site has one. It is also good practice to list the sitemap URL within your site's robots.txt file, as this is the first file on your website that any spider will read.
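A minimal robots.txt carrying a sitemap reference might look like this (the domain and disallowed path are placeholders):

```
# robots.txt at the site root (hypothetical example)
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Because crawlers fetch robots.txt first, the `Sitemap:` line hands them your full page list before they start exploring.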
Having a strong and viable SEO strategy is a critical part of any online business, and even more so if that business is in eCommerce. Competition on the web is fierce, but there is plenty of information available to help a growing business make the right decisions and compete on a level playing field with other businesses.
As I said at the beginning, though, there is a lot out there that can lead you astray and leave you in a world of pain. Hopefully these pointers will help you steer clear of the dark side and become a Jedi of the light. There is generally no fast route to worthwhile organic traffic, and if something promises results that sound too good to be true, they generally are.