If your site is at least 6 months old, has been consistently receiving measurable traffic from Google and suddenly stops receiving it completely, the site may have been banned.
If your site is new (less than 6 months old), has a fair number of quality backlinks (a few hundred) and has never received any traffic from Google at all, the site may be either in Google's sandbox or banned.
Those are just outward signs. By themselves they prove nothing. To check for a ban, you need to conduct a detailed investigation as outlined below:
1. Check index.
Type site:www.yoursite.com into Google and see whether it returns any pages. If it does not, the site may be banned. An alternative is to check the site's PR on the Google Toolbar. If there has been at least one PR update (they usually happen every 3 months) since your site was launched, and the PR indicator on the Toolbar is grey, your site is not in the index.
2. Check backlinks.
Type link:www.yoursite.com into Google and see whether it returns any links pointing to your site. If it does not, and you know for sure that external links to your site do exist, this is a strong indication of a Google ban.
3. Check domain name in search.
Type “www.yoursite.com” (in quotes) into Google. Your site should appear on the 1st page of Google SERPs, usually in 1st place or one of the top positions. If it does not, it is likely to be banned.
4. Check web server logs.
The server that hosts your website creates and maintains special files called logs. Those files contain full statistics on all activity and performance of your server, as well as traces of any problems or errors that may have arisen. The activities recorded in the logs include the number of visitors, their geography, traffic distribution over the time of day, pages visited and time spent on them, words searched on your site, the queries by which your site was found in search engines, and much other very useful data.
One particularly interesting piece of information contained in the log files, however, is which search engine robots have been crawling your site. If you analyse the log files and find that Googlebot's crawling frequency has dropped abruptly (particularly if it has dropped to zero) and the bot has not been visiting your site lately at all, this is a strong indication of a ban.
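If you prefer to inspect the raw logs directly, a short script can count Googlebot visits per day. This is a minimal sketch that assumes the common/combined log format used by default on Apache and Nginx; the file name access.log is just an example, and real analysis would also verify the bot's IP address, since the user-agent string can be faked.

```python
import re
from collections import Counter

# Matches the date part of a timestamp like [01/Mar/2010:10:00:00 +0000]
# in the common/combined log format (an assumption about your server setup).
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    """Return a Counter mapping date string -> number of Googlebot requests."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:  # crude user-agent check
            continue
        m = LINE_RE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

# Example usage against your own log file:
# with open("access.log") as f:
#     for day, count in sorted(googlebot_hits_per_day(f).items()):
#         print(day, count)
```

A sudden drop to zero in this per-day count is exactly the symptom described above.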
A ban is not the only explanation, however: technical and internal site problems can also prevent the site from being crawled. To distinguish between the two factors, you will need to analyse your site (see the next section).
Be aware, though, that log files are written in a terse technical format, and unless you are an experienced system administrator, you won't be able to read them easily. To decipher the log files you need special third-party software – a web log analyzer. The best software I know and recommend is Advantage Web Log Analyzer. It is easy to use and includes all the functions and features needed to perform a comprehensive log analysis.
Note that Advantage Web Log Analyzer is not only helpful for analysing the regular server stats mentioned above; it can also track visitor activity and paths through the various pages on your site, helping you find out where potential buyers/clients drop off and thus improve your site's conversion and ROI.
5. Check and fix internal site problems.
There may be a number of technical/coding problems with the internal structure of your site:
- Robots.txt exclusions – some paths within your site may be blocked by directives in the robots.txt file, prohibiting indexation of some pages and thus blocking the crawler's path through your site.
- Navigation problems – dead internal links, error pages, password-protected pages, etc.
- Inappropriately set up redirects in .htaccess.
- URL format – overly long dynamic URLs, especially those including session IDs, can cause problems with site indexation or even lead to a ban (due to duplicate-content issues).
Any of the above can make it difficult for the crawler to travel through your site, or block it off completely, preventing proper indexing of your site and, as a result, strongly hurting your rankings on Google.
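The robots.txt exclusions above are easy to test programmatically. The following sketch uses Python's standard-library robots.txt parser to check whether Googlebot is allowed to fetch specific URLs; the rules and URLs shown are illustrative only, so substitute your own robots.txt content and pages.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules -- replace with your site's actual file.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /products/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check a few representative URLs (hypothetical examples):
for url in ("http://www.yoursite.com/",
            "http://www.yoursite.com/products/widget"):
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "-> allowed" if allowed else "-> BLOCKED")
```

If a page you expect to rank shows up as BLOCKED, the crawler path problem is in robots.txt rather than a ban.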
6. Reviewing and correcting errors.
If you found any of the problems listed in item 5, get them fixed and resubmit the site to Google. If, however, you failed to find any internal problems, and one or more of the signs described in items 1 to 4 is present, this is a strong indication that your site has been banned by Google. In that case, you will need to analyse your site's content and recent SEO activity. Possible reasons for a ban include duplicate content or unethical/black-hat SEO practices such as paid links, cloaking, link farms, or doorway pages redirecting traffic to your site.
If you find that any of the above has taken place on your site (whether done by you or an external SEO company), rectify those problems as soon as possible.
7. Resubmitting site to Google.
Fix all the issues listed in item 6, then submit your site for reconsideration via the Google webmaster panel or send an email directly to firstname.lastname@example.org. Be sure to state the reasons why you believe your site was banned and briefly review the actions you have taken to fix those problems. Make sure your site now complies with Google's webmaster guidelines. Apologise, promise that this kind of thing will never happen again, and kindly request that your site be re-included in the Google index.
Google's support team usually doesn't reply to such requests, so after resubmitting your site you will need to monitor its indexation yourself. Keep your fingers crossed: hopefully your site will soon be crawled, reappear in the Google index, and once again receive natural search traffic from Google.
For more information on possible reasons for removal from the Google index, see also: Site removed from the Google index.
Your feedback is welcome!
If you have any comments or questions on Google ban, don’t hesitate to leave a comment below.
If the post is useful, please share it with your friends 🙂