I often get calls for help from website owners whose traffic has mysteriously dropped and not recovered for some time. The site had ranked well for a generic search phrase but now is nowhere to be found in the Google search results. Often the cause is an algorithm change by Google, such as a new
filter. But sometimes the site owner has done something to violate the Google Guidelines for Webmasters and attract a penalty.
Ban – the ultimate punishment
Google and other engines sometimes ban a site by manually removing it from the index, usually for a serious breach of the search engine’s guidelines. You can tell whether Google has removed you from the index by using the site: search operator, which searches specifically for pages on the site. If this returns a “matched no documents” message then Google has dropped the entire site and you’ll need to work through the guidelines to find out where you are going wrong. If you see even one page of the site in the results then it is not a ban and you need to treat it as a filter.
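For example, assuming the site lives at example.com (an illustrative domain), these are the queries to try in the Google search box:

```
site:example.com        every page Google has indexed on the domain
site:example.com/news   indexed pages under a particular path
```

If the first query returns no results at all, the whole site is out of the index; if it returns some pages but not others, look for a filter instead.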
Below is a list of penalties that Google might impose on your site:
Duplicate Content Penalty
Google does not like sites with duplicate content where the main purpose appears to be manipulating the search engine rankings by listing the same content over and over to rank higher for some keywords. But some novice web designers cheat their clients by copying content from elsewhere, and some customers even ask for this copying to happen. Either way it is a breach of copyright and can get the site removed from the search engines (which scares the SEO experts more than the legal implications!).
How to Avoid Duplicate Content
- Use Unique Content. This is the obvious recommendation. When there is an overwhelming desire to copy content (usually in multinational companies), try to be at least 30% unique on each page (there is no precise percentage; more is better).
- Use 301 Permanent Redirects. Many people buy additional domains and then forward them incorrectly. While they think they have multiple websites, Google chooses one of them as the official one and ignores the rest. This can be fixed by placing a 301 redirect from the secondary sites to the one and only official site.
- Use CNAME Aliasing Correctly. CNAME records in the DNS are canonical domain name aliases. For example, www.example.com may lead to the same web page as example.com or server2.example.com. This becomes an issue for SEO if others link to the “wrong” alias.
- Avoid Repeating Disclaimers. Some sites repeat a disclaimer or other boilerplate text on every page. Such text can be placed in an iframe so it does not count as duplicated body text on every page.
- Block Appropriately. Some sites have a “printer-friendly” or mobile copy of their pages, which should be blocked via robots.txt or a noindex meta tag to avoid a duplicate penalty.
- Understand Your Architecture. Some CMSs, forums, and directories can display the same page through several URLs, which can lead to a duplicate penalty. A meta robots noindex tag should be on any page that requires a user to be logged in to view the content.
- Syndicate Articles Carefully. Make sure people who copy your content link back to you; otherwise you could find that their copy is treated as the official one and yours is deprecated.
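For the blocking advice above, a minimal sketch, assuming the printer-friendly duplicates live under a /print/ directory (the path is illustrative):

```
# robots.txt - keep spiders away from the printer-friendly duplicates
User-agent: *
Disallow: /print/
```

Alternatively, each duplicate page can carry a meta robots tag in its head section:

```html
<meta name="robots" content="noindex, follow">
```

The noindex keeps the page out of the index, while follow lets the spider continue through the links on that page.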
Solving a Duplicate Penalty
- Place a 301 redirect to the desired site.
- Or build a new site with unique content at the other domain. If the content is complementary it would make a great source of links to the main site.
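On an Apache server, the 301 redirect from a secondary domain can be set up in an .htaccess file. A sketch, assuming the secondary domain is old-example.com, the official site is www.example.com, and mod_rewrite is enabled (all three are illustrative assumptions):

```
# .htaccess on the secondary domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag tells spiders the move is permanent, so the official domain inherits the links pointing at the old one.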
Filing a Reconsideration Request
If you suspect a Google spam penalty and have cleaned up your site, you need to file a Request for Reconsideration from the Webmaster Tools Dashboard.
Supplemental Results
Instead of appearing in Google’s main search results, your site might start showing up in Supplemental Results. In Google this is a secondary index, shown only when there aren’t enough results in the main index. Sites end up in the Supplemental Index when pages have duplicate content, insufficient links, or insufficient PageRank.
They can be forwarded pages or 404s that used to hold the original content; these disappear from the index after one year. Duplicate pages that still show content are hard to get out of Google and are not often visited by the Googlebot.
Recovering from Supplemental Results
Supplemental results are not a penalty for violating the Google Guidelines for Webmasters. The challenge is to obtain quality links to inner pages, which is not practical if you have thousands of pages. Some tests suggest that if you buy an expired domain name and host it with a 301 redirect to the Supplemental or “sandboxed” site, the problem will go away.
The Google Sandbox
There was a theory in 2006 and earlier that all new websites with new domain names were deliberately prevented from ranking for six months. It has since been shown that there is no Google sandbox, but there are filters that prevent some sites from appearing for some time.
If you use a shared web host run by an inept operator, you might find that someone has inserted links to bad neighbourhoods using Cross-Site Scripting (XSS) and other techniques. This can earn a ranking penalty.
Recently Google has started demoting sites that sell links, so their customers’ sites also go down. The only solution here is to get natural links (or natural-looking links). One paid-links provider supplies code that lists the expiry date of those links; this is an easy way for a spider to find and demote them.
Risky SEO Tactics that might (or might not) get you banned
There are a few so-called SEO tricks that no longer work or will stop working once the search engines can filter them algorithmically. Some are the tools of the trade of “black hat” operators. Others are practices that pass as legitimate only because the search engines appear to be oblivious to them so far.
Keyword Stuffing
When you look for every opportunity to load keywords (typically in the code) and exceed the recommended limits, you could be accused of keyword stuffing. The general advice is: don’t.
<!-- this is a comment. keyword1 keyword2 -->
Comments are not useful for improving rankings because they are meant to be ignored. Stuffing keywords in them will not help and will needlessly weigh down the page.
Cloaking
Cloaking means showing one page to the search engine spider and another to humans. At its worst, the search engine might list some innocuous content while you see a very offensive page when you click the result link.
You can confirm that cloaking is being used by comparing the live website with the version frozen in Google’s cached copy. BMW Germany was caught cloaking in early 2006, and Google made an example of it by removing bmw.de from the index for a few days.
When bmw.de was banned it couldn’t be seen in Google. (It should have been the first entry)
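A crude way to automate the cached-copy comparison is to fetch the same URL with two different User-Agent strings and measure how alike the two responses are. A minimal sketch; the sample pages, threshold, and URLs are illustrative assumptions, not a definitive detector:

```python
import difflib

def similarity(html_a: str, html_b: str) -> float:
    """Return a 0..1 ratio of how alike two HTML payloads are."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()

# In practice you would fetch the same URL twice, e.g.:
#   curl -A "Mozilla/5.0 ..."   http://www.example.com/ > browser.html
#   curl -A "Googlebot/2.1 ..." http://www.example.com/ > spider.html
# Here two hypothetical responses stand in for the downloads:
page_for_browser = "<html><body>Buy cheap widgets now!!!</body></html>"
page_for_spider = "<html><body>A scholarly history of the widget.</body></html>"

ratio = similarity(page_for_browser, page_for_spider)
if ratio < 0.9:  # threshold is a guess; tune it for your own pages
    print("Responses differ substantially - possible cloaking")
```

Identical responses score 1.0; the lower the ratio, the more the spider copy diverges from what a human sees.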
Same Coloured Text as Background
In the early days of SEO you could stuff keywords on a page and make them invisible by colouring the text the same as the background, e.g. white text on a white page. You can sense this is happening when there appears to be a lot of blank space on a page. Selecting everything on the page (press Ctrl+A) reveals the hidden text. If you are solving a ranking problem, this is something to look for.
Hidden text is not always the same colour as the background. It can be hidden through the cascading style sheet (CSS) so that the text is not displayed (but is still there), or positioned off-screen using negative coordinates. Such tricks are a sign of laziness; the same effort could be applied towards adding visible text to a page.
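The common CSS variants of hidden text look like this (the keywords are placeholders):

```html
<!-- all three keep the text in the page source but out of sight -->
<div style="display: none">keyword1 keyword2</div>
<div style="visibility: hidden">keyword1 keyword2</div>
<div style="position: absolute; left: -9999px">keyword1 keyword2</div>
```

Viewing the page source, or searching the stylesheet for these properties, will expose all three.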
Reciprocal Links
Reciprocal links are recommended only between highly complementary sites, e.g. a real estate agent in North Sydney linking to another real estate agent in South Sydney.
To get around the problem of two non-complementary sites needing links, some people will say “If you link to my site, I will get a third-party site to link to you.” After all, how would Google know that the third-party link to your site has anything to do with your link to mine? If the “third-party” domain name is owned by the same person and both sites are on the same IP address, it is easy for Google to make that connection.
Investigating an IP Address
For many years Google has had access to “whois” records, which show who owns a domain name (unless it is a private registration). There are websites that will show you detailed information such as all the sites hosted at a single IP address. Why is this useful?
If you rented a dedicated web server (a computer in a data centre that is entirely yours to use), it is reasonable to assume that all websites on that IP address are within your control and are probably owned by you.
For example, ACP owns many well-known magazine properties such as Cleo, Cosmopolitan, FHM and many others (none of which are doing anything questionable, but we are using them as an example). MyIPNeighbors.com shows all the domain names hosted at that IP address (only if their robot has crawled that IP address).
Yes, as soon as you buy a domain name, the spider from MyIPNeighbors.com will test it to see if a website exists. If so, it will be added to their knowledge base.
The US Copyright Office also publishes submissions from online service providers who have listed all their active domain names. The companies listed there are protected by the safe harbor provisions of the US Digital Millennium Copyright Act (DMCA) of 1998: ISPs or web hosting companies can avoid liability if one of their users posts material that belongs to someone else. Of course this is a very small list of companies, but it all helps. There could be many other resources available to search engines that can help to corroborate ownership of domain names.
Artificial Link Networks
Some SEO companies are known to create artificial link networks for their clients and for others who can help in some way, such as by providing a reciprocal link. One implementation is for the client site to have a page full of anchor-text links to other non-competing websites. Say the client site is a real estate agent in Sydney; their “network” page will contain links such as:
- Home for sale Melbourne
- Kew East property
- House and land packages Melbourne
- Croydon real estate agent and so on
The tactic here is to strengthen various popular keyphrases for specific websites. This is a legitimate SEO technique when a given keyphrase is the anchor text for links from several unrelated websites. However, when several websites repeat a very similar set of links, none of which point at their own pages, it looks like a three-way link network on steroids.
Sitewide Links
Also known as Run-of-Site (RoS) links, sitewide links appear on all pages of a given site. They are usually sold as advertising, so they appear not to be “link buying”, which is likely to get you banned by Google.
So how do you distinguish between link buying and advertisements? Any ban by Google or another search engine is likely to be made manually, case by case. A RoS listing that uses keyphrase anchor text and a clean, direct link is a possible ban candidate. Since genuine ad companies are likely to count clicks or impressions, their links usually pass through a counting script, which may let them escape notice.
Bought Links
The situation with bought links is similar to RoS linking, except that there is usually just one link per customer and it is often on the home page. Such a link can be found within a blog posting that purports to be a review. Again, clean anchor text is a giveaway and bans are applied case by case.
Google’s Help pages say “Buying or selling links that pass PageRank is in violation of Google’s webmaster guidelines and can negatively impact a site’s ranking in search results.” Tread carefully.
Recourse if your site is banned
Create or use a Google account and log in to Google’s Webmaster Tools. From there, request reconsideration of your site. See below for Google’s own video advising on what to do.
ABOUT THE AUTHOR
Ash Nallawalla manages natural and paid search traffic for some of
Australia’s largest websites. He is an internationally acknowledged
expert in search marketing and he previously operated an SEO training
business. This article series is excerpted from his training notes. Ash
can be contacted at firstname.lastname@example.org
and he blogs at