When a Web site is banned by Google, there is no warning other than a steady drop in sales and/or visitors. The Google team does not share any information to pinpoint the reasons why the Web site was removed.
This article is an attempt to explain the most common reasons why a Web site gets banned by Google (this guide applies to other search engines as well).
1. Robots and Meta Tags
Make sure that your robots.txt file does not prevent search engines from entering your site. Also, make sure your meta tags are not directing search engine robots to exclude your site. Check your robots.txt file (if you have one) and your meta tags. Unless you want your site hidden, you should never include a noindex/nofollow directive in your robots meta tag.
If you have this directive, the Google spider will not index your site.
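The meta tag in question typically looks like the following (a minimal sketch; noindex and nofollow are the standard robots meta values):

```html
<!-- With content="noindex,nofollow", search engine spiders will neither
     index this page nor follow its links. Remove this tag (or use
     content="index,follow") if you want the page to appear in search results. -->
<meta name="robots" content="noindex,nofollow">
```

This tag belongs in the page's head section and applies only to the page that contains it.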
Example Robots.txt Format
Allow indexing of everything:
User-agent: *
Disallow:
Disallow indexing of everything:
User-agent: *
Disallow: /
To allow Googlebot to index only one file within a folder:
User-agent: Googlebot
Disallow: /FOLDER_1/
Allow: /FOLDER_1/MyFILE.html
2. Cloaking
You are cloaking if your Web site or Web pages are configured to display different information to a search engine spider than to a real person. The cloaked Web page usually contains keywords and/or phrases that give it a higher rank.
There are legitimate reasons for cloaking, such as targeted advertising, but it is unwise in the long run to try to manipulate your rankings.
3. Duplicate Content
Make sure no other Web site has published content from your Web site; all sites with the same content can be penalized. You can check by performing a Google search on some of your text enclosed in quotation marks ("). If you find a Web site using content from your original text, go to Removing Content From Google and follow the instructions. You can also search for copies of your page on the web.
4. Hidden Links and/or Text
Hidden links and/or text are displayed in such a way as to be invisible or unreadable to humans, but can still be seen by search engine spiders. This is most commonly achieved by setting the font color to the same color as the background, rendering the text invisible unless the user highlights it.
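The technique described above can be sketched as follows (a hypothetical example; the keyword text is made up for illustration):

```html
<!-- Hidden text: white text on a white background. A visitor sees nothing
     unless they highlight the paragraph, but a spider reads the keywords. -->
<p style="background-color: #ffffff; color: #ffffff;">
  cheap widgets best widgets buy widgets online
</p>
```

Search engines can detect this pattern by comparing the computed text color against the background color.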
5. Keyword Spam/Stuffing
Keyword spam or stuffing is considered an unethical search engine optimization (SEO) technique. It is achieved by loading keywords into the meta tags or into the content. The repetition of words in meta tags may explain why many search engines have completely stopped using these tags.
If a word or phrase is repeated too often in the meta tags and/or content, Google and other search engines can apply a filter that reduces the site's rankings, or simply ban the site.
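A keyword-stuffed meta tag might look like this (the repeated terms are a made-up example, not taken from any real site):

```html
<!-- Keyword stuffing: the same terms repeated over and over in the
     keywords meta tag. This is the pattern search engines filter or penalize. -->
<meta name="keywords" content="widgets, widgets, cheap widgets, widgets,
  best widgets, widgets, buy widgets, widgets, widgets online, widgets">
```

A legitimate keywords tag would list each relevant term once, without repetition.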
6. Doorway Pages
Doorway Pages are web pages created for spamdexing: they attract search engine spiders in order to achieve a higher ranking for their targeted keywords. Doorway Pages are also known as entry pages, gateway pages, bridge pages, jump pages, portal pages, and various other names. Doorway Pages that redirect visitors, with or without their knowledge, use some form of cloaking. As you may know by now, cloaking is a bad idea, as it is banned by search engines, including Google.
7. Paid Links
According to Google,
"Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results. Links purchased for advertising should be designated as such. This can be done in several ways, such as":
However, buying and/or selling links to improve PageRank violates Google's Webmaster Guidelines and might get your site banned.
Learn more about How to: Buy Links Without Being Called a Spammer
8. Avoid Linking to Bad Neighborhoods
The following excerpt is taken from Google's Webmaster Guidelines page (Quality guidelines):
"Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web as your own ranking may be affected adversely by those links."
Bad neighborhoods are Web sites that knowingly or accidentally utilize Black Hat SEO techniques intended to artificially boost a site's rankings in search engine results pages (SERPs). These Web sites are in violation of Google's guidelines.
Avoid linking to any Web sites that use any of the techniques mentioned in this article to increase their search engine rankings.
FYI, Google not only penalizes a Web site that tries to cheat the system, but also penalizes Web sites that link to a penalized site.
9. Code Swapping
Code Swapping is also known as bait and switch. It refers to the act of changing a web site's content right after it has achieved a high ranking in a search engine.
What does Google say?
To avoid being banned, Google states:
"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"
How to get back into Google
Google will eventually spider your Web site again. You may have to wait a few months for Google to re-index your Web site, so be patient.