How To Fix Google Webmaster Tool Site Crawl Errors



Google Webmaster Tools (GWT) is the best free SEO tool you can get for your blog. Whatever your level as a blogger, GWT lets you keep an eye on your site's overall health and helps you implement various optimization techniques for your blog. We have covered a few of these in the past, so if you are starting late, you can refer to those earlier articles.

A few days back, Google added a useful chart to Webmaster Tools that shows the site crawl error rate and what caused those errors. This is indeed a very useful feature, and here I'm going to quickly show you what you can determine from these errors, along with possible fixes for each type of site crawl error warning in Google Webmaster Tools.

Getting Started with Site Crawl Errors in GWT:

Log in to your GWT account and, under your site dashboard, click on Health > Crawl Errors. This page shows a graph of errors and what caused them. Since I'm not facing any crawling issues myself, I'm using an image from the official blog post; a chart with various crawl errors will look something like this:

Site crawl errors

At the time of writing, Google Webmaster Tools shows three types of errors:

  • DNS errors
  • Server connectivity
  • Robots.txt Errors

If you see your chart climbing like the one in the image above, understand that it's your responsibility to fix the problems so that bots can crawl your site effectively. Below, we will discuss each site error type in brief.

Google Site Error Types

On the same page, you will see a list of affected links; once you have solved the issues, you can select individual links and mark them as fixed.

DNS Errors:

This is like the normal DNS issues we face on our own systems, when the Internet connection can't resolve an address and the browser displays a DNS error. It can happen for many reasons; a common one is when you move hosting or change name servers. Since DNS replication takes time, errors during that window are expected. Sometimes bots can't access your site from certain countries because of a domain name server failure, and you get a similar error. You can always check the ping status of your site from different countries using the Just-ping service.
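Alongside a service like Just-ping, you can sanity-check resolution from your own machine. Here is a minimal Python sketch; the hostname you pass in is a placeholder for your own domain:

```python
import socket

def dns_resolves(hostname):
    """Return True if the hostname currently resolves, False on a DNS failure.

    A burst of failures right after a hosting move or name-server change is
    normal while DNS propagates; persistent failures mean you should get
    your host's technical support involved.
    """
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        return False

# Replace with your own domain when you run this:
dns_resolves("yourdomain.com")
```

If this returns False from several networks long after a DNS change, the problem is on your host's side, not Googlebot's.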

A few days back I moved ShoutMeLoud from Hostgator to a Knownhost VPS, and I could see my DNS errors increasing; finally, one day I received this email from GWT:

“Over the last 24 hours, Googlebot encountered 66 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 1.5%. You can see more details about these errors in Webmaster Tools.”

Since I knew the reason for these errors, I never bothered much about them, and now I have zero DNS errors. Another cause of DNS errors in Webmaster Tools is high latency, which is usually an issue with your hosting. If you are constantly getting DNS errors, contact your host's technical support and ask them to look into it. If they can't fix the issue, simply move to better hosting instead of blaming the host or Google for the errors.

Server Connectivity Errors:

This issue is mostly due to your server configuration: Google's bots can't crawl your site, or your site times out before the page loads. That usually happens when a page takes too long to load or your server resources are exhausted. One easy way to mitigate this is to control the bots' crawl rate. Unfortunately, GWT doesn't offer a graph like Bing Webmaster Tools does for controlling crawl timing, so you can use the URL Parameters tool and robots.txt to control crawling instead.

Make sure your server is not underpowered; if it is, get hosting with more resources. If you are using a self-configured VPS or dedicated server, make sure your firewall and security settings are not blocking search engine bots. You can read more about this on Google's official help page. If you are not using any caching mechanism, I would recommend using one of the WordPress cache plugins, which will go a long way toward fixing server connectivity errors.
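A quick way to see whether your pages are slow enough to trigger timeouts is to time a fetch yourself. The sketch below is a rough illustration, not how Googlebot measures anything; the URL and timeout are placeholders:

```python
import time
import urllib.request
from urllib.error import URLError

def fetch_seconds(url, timeout=5.0):
    """Fetch a URL and return the elapsed seconds, or None on failure/timeout.

    Pages that routinely take several seconds, or that fail when the server
    is under load, are the kind that show up as server connectivity errors.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # read the full body, like a crawler would
        return time.monotonic() - start
    except (URLError, OSError):
        return None

# Replace with one of your own post URLs when you run this:
fetch_seconds("https://yourdomain.com/some-post/")
```

Run it against a few of your heaviest pages before and after enabling a cache plugin, and you can see the improvement directly.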

Robots.txt Fetch Errors:

This is the most common error, and it happens because of a misconfigured robots.txt file. Keep one thing in mind: robots.txt is used to stop bots from crawling areas of your site that you don't want them to access, usually your wp-admin folder and URL parameters, if you use them. I recommend not blocking your tag and category pages with robots.txt, as crawling and indexing are two different things. We have already discussed this at length in the past, and you can refer to our earlier robots.txt articles to get a complete picture.
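You can verify what a rule set actually blocks before publishing it, using Python's standard robots.txt parser. This is a sketch with a hypothetical rule set and domain; it only blocks wp-admin, in line with the advice above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block only the wp-admin area, leave tags/categories crawlable.
rules = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# wp-admin is blocked for every bot, but normal content stays crawlable:
rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php")  # False
rp.can_fetch("Googlebot", "https://example.com/tag/seo/")              # True
```

Checking your rules this way is much cheaper than discovering a typo after Googlebot has already stopped crawling half your site.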

Always remember that some of the errors you see are temporary. For example, server connectivity errors could have been caused by a momentary spike in server load and may not be permanent. But if you see the errors consistently, or you are getting emails from GWT about crawling issues, you should start working on fixing the cause of the problem.

Do let me know what errors you see under Site Crawl Errors in your Google Webmaster Tools, and what action you are taking to fix them.

Article by Harsh Agrawal

  1. says

    Hello Harsh,

    My DNS Error was first red and now its orange.. my google fetch still dont work when i fetch new pages.. Do you know when this will be over? I bought my site 3 weeks ago.. I deleted al the issus in webmastertools and also to sitemap of the previous owner. I dont know what to do? Do you think my site is bad? Or is this just normal? Thanks for the help!

  2. says

    from time to time GWT show me crawl and robots.txt error
    few days it is ok and then suddenly once again it is bad
    there are no big connectivity
    fetch as google doesnt work also (from time to time)
    is it problem of googelbot or hat else could it be?

  3. Rashid says

    The post is likely understanding,my site also effected this issue and they will show my dns error 85% but my host is go daddy.i send a ticket and they say their is no issue.then what i will do…?

  4. says

    Hi Harish,

    I had a big problem with my hosting, 5 days my site was down …and now I have 3,168 errors. Any ideas how to mass fix those errors ?

    • says

      Those errors are temporary and you need to ensure your website doesn’t go down again. You can resubmit Google sitemap and you will notice number of errors going down every day.

  5. says

    Hi Harish Aggarwal plz help me to solve my problem..
    First I got robots.txt errors and then DNS error… Due to these errors I lost my lots of traffic..
    what to do Plz help ?

  6. subbareddy says

    hi harsh ji help me in fixing mt problem

    Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site’s pages.

    Recommended action
    Check the Crawl Errors page in Webmaster Tools.
    Check that pages on your site don’t link to non-existent pages. (If another site lists a broken link to your site, Google may list that URL but you may not be able to fix the error.)


  7. Abhishek Pathak says

    Thank Harsh sir for this lovely post but i want to know
    After how much time DNS error message is marked as green by googlebot.

  8. priyansh setia says

    my website is faceing Server connectivity problem from last 20 days i have tried all the above mentioned steps but it has not resolved till yet…..plzz help

  9. says

    Thanks for sharing this usefull tips. .I use Screaming Frog SEO Spider to crawl my website and find the links which shows 404 errors and find invalid links

  10. Mukul says

    I transferred my hosting service to bigrock 10 days ago, after that i am having server connectivity issue, my posts also are not being indexed. What to do ??

    • says

If you have done the shift just now, this error is temporary…
You can use a DNS propagation checking service to see whether your DNS has propagated globally or not…

  11. Catalin says

    robots.txt can be edited offline should be located on your /www /html_public folder right there, if its not there then you can create it manually in Notepad

  12. says

    This is really very useful guide on fixing crawl errors in Google webmaster tools. It’s very clear now, when a page takes too much time to load or the site timeouts before loading the page. This leads to server connectivity errors, thereby Google bots cannot crawl the site, but this may may not be permanent. Thanks Harsh for useful info :)

  13. akhilendra says

    great tips, thanks for sharing. we often receive these errors but with no clue on how to handle this. I think configuring robots.txt is quite tricky. We may block something which should not be and the same time allow something which should be.

  14. george t mathew says

    The post is quite technical in nature. I could’t make much of it. By the way how to make this robots.txt and the ht access file which you often talk about.
