Google Webmaster Tools (GWT) is the best free SEO tool you can get for your blog. For a blogger at any level, GWT lets you keep an eye on your site's overall health and helps you implement various optimization techniques for your blog. We have covered a few of these in the past, and if you are starting late, here are some articles you can refer to:
- What are Google sitelinks and how to remove unwanted sitelinks
- How to deindex WordPress tags, categories and Attachment links from Google
- Search engine optimization of Blog using Google Webmaster tool
A few days back, Google added a beautiful chart to Webmaster Tools which shows your site's crawl error rate and what causes those errors. This is indeed a very useful feature, and here I'm going to quickly show you what you can determine from these errors, along with possible solutions to fix these site crawl error warnings in Google Webmaster Tools.
Getting started with site crawl errors in GWT:
Log in to your GWT account and, under your site dashboard, click on Health > Crawl Errors. This page shows a graph listing the errors and what caused them. Since I'm not facing any such crawling issues, I'm using the image from the official blog post; a site with various crawl errors will look something like this:
At the time of writing, Google Webmaster Tools shows three types of errors:
- DNS errors
- Server connectivity
- Robots.txt Errors
Well, if you see your chart climbing like in the image above, you need to understand that it's your responsibility to fix the problems and let bots crawl your site effectively. Below, we will briefly discuss each site error type.
Google Site Error Types
On the same page, you will see a list of affected links, and once you have solved the issues, you can select individual links and mark them as fixed.
DNS Errors:
This is like the normal DNS issues we face on our own systems, when our Internet connection is unable to resolve an address and the browser displays a DNS error. This can happen for many reasons, one of them being when you move hosting or make name server changes. Since DNS propagation takes time, such errors are common after a move. Sometimes bots in certain countries can't access your site due to a domain name server failure, and you get a similar error. You can always check the ping status of your site from different countries using the Just-ping service.
A few days back I moved ShoutMeLoud from Hostgator to a Knownhost VPS, and I could see my DNS errors increasing, until one day I received this email from GWT:
“Over the last 24 hours, Googlebot encountered 66 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 1.5%. You can see more details about these errors in Webmaster Tools.”
Since I knew the reason for these errors, I never bothered much about them, and now I have zero DNS errors. Another reason for DNS errors in Webmaster Tools is high latency, which is usually an issue with your hosting. If you are constantly getting a high DNS error rate, you should contact your host's technical support and ask them to look into it. If they can't fix the issue, simply move to better hosting instead of blaming the host or Google for such errors.
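If you want to check DNS resolution yourself before blaming your host, a quick script can help. Here's a minimal sketch using Python's standard library; `example.com` is just a placeholder, so swap in your own domain:

```python
import socket

# Placeholder domain -- replace with your own blog's hostname.
DOMAIN = "example.com"

try:
    # getaddrinfo performs the same DNS lookup a browser (or bot) would.
    results = socket.getaddrinfo(DOMAIN, 80)
    addresses = sorted({info[4][0] for info in results})
    print(f"{DOMAIN} resolves to: {', '.join(addresses)}")
except socket.gaierror as err:
    # A failure here is roughly what Googlebot reports as a DNS error.
    print(f"DNS lookup failed for {DOMAIN}: {err}")
```

If this fails intermittently from your own machine, there's a good chance bots are hitting the same failures.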
Server Connectivity Errors:
This issue is mostly due to your server configuration: Googlebot can't crawl your site, or your site times out before the page loads. This usually happens when a page takes too long to load or your server resources are exhausted. One easy way to mitigate this is to control the crawl rate of bots. Unlike Bing Webmaster Tools, GWT doesn't offer a pretty graph to control crawl timing, so you can use the URL Parameters tool and robots.txt to control crawling.
Make sure your server is not underpowered, and if it is, get hosting with more resources. If you are using a self-configured VPS or dedicated hosting, make sure your firewall and security settings are not blocking access to search engine bots. You can read more about this on the official page here. If you are not using any caching mechanism, I would recommend using one of these WordPress cache plugins, which will help fix server connectivity errors to a great extent, as shown in the quick check below.
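To get a rough idea of whether your server responds slowly enough to cause timeouts, you can time a simple request. This is a minimal sketch, again with `example.com` as a placeholder URL; a real check would hit several pages, not just the homepage:

```python
import time
import urllib.request

# Placeholder URL -- point this at your own blog.
URL = "https://example.com/"
TIMEOUT_SECONDS = 10  # Roughly the patience a crawler might have.

start = time.monotonic()
try:
    with urllib.request.urlopen(URL, timeout=TIMEOUT_SECONDS) as response:
        response.read()
        elapsed = time.monotonic() - start
        print(f"{URL} answered {response.status} in {elapsed:.2f}s")
except Exception as err:
    # Timeouts and connection resets here mirror what bots run into.
    print(f"Request to {URL} failed: {err}")
```

If pages regularly take several seconds even with caching enabled, it's time to talk to your host about resources.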
Robots.txt Fetch Errors:
This is the most common error and happens because of a misconfigured robots.txt file. Just keep one thing in mind: robots.txt is used to stop bots from crawling certain areas of your site which you don't want them to index, usually your wp-admin folder and URL parameters, if you are using them. I recommend not blocking the crawling of your tag and category pages with robots.txt, as crawling and indexing are two different things (see the sketch below for a quick way to test your rules). We have already discussed this a lot in the past, and you can refer to the following articles to get a complete idea of it.
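If you are unsure whether your robots.txt is blocking something it shouldn't, Python's standard library can parse it for you. A minimal sketch, assuming your robots.txt lives at the usual location, with `example.com` and the test paths as placeholders:

```python
from urllib.robotparser import RobotFileParser

# Placeholder site -- replace with your own domain.
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # Fetches and parses the live robots.txt file.

# Check what Googlebot is allowed to crawl.
for url in ("https://example.com/wp-admin/", "https://example.com/tag/seo/"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"Googlebot {'may' if allowed else 'may NOT'} crawl {url}")
```

If a tag or category URL shows up as blocked here, that's your robots.txt fetch error waiting to happen.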
Always remember that some of the errors you see are temporary. For example, server connectivity errors could have been caused by a momentary server load spike and may not be permanent. But if you see the errors consistently, or keep getting emails from GWT regarding crawling issues, you should start working on them and fix the cause of the problem.
Do let me know what errors you see in your Google Webmaster Tools site crawl errors report. What actions are you taking to remove these site errors?