How to Fix Google Webmaster Tool Site Crawl Errors

Google Webmaster Tool (GWT) is the best free SEO tool you can get for your blog. Whatever your level as a blogger, GWT lets you monitor the overall health of your site and helps you implement various optimization techniques for your blog. We have covered a few of these in the past, but in case you are just getting started, here are some articles to bring you up to speed:

A few days back, Google added a beautiful chart to Webmaster Tool showing the site crawl error rate and what causes those errors. This is indeed a very useful feature, and I am going to quickly show you what you can determine from these errors, along with possible solutions for repairing the site crawl error warnings using Google Webmaster Tool.

Getting started with site crawl errors in GWT:

Log in to your GWT account, and under your site dashboard, click on Health > Crawl errors. This page displays a graph of the errors detected and what caused them. Since my blog is not facing such crawling issues, I’m using the image from the official blog post. The detection of various crawl errors will look something like this:

Site crawl errors

At the time of writing this post, Google Webmaster Tool shows three types of errors:

  • DNS errors
  • Server connectivity errors
  • Robots.txt errors

If you see the errors detected in your chart growing like those shown in the image displayed above, remember that it is your responsibility to fix all of these errors so that the bots can crawl your site effectively.

Let’s have a look at the various types of possible errors:

Types of Google site errors

On the page showing you the detected errors, you will see a list of links that are affected. Once you have solved the issues, you can select individual links and mark them as fixed.

DNS errors:

These are like the normal DNS issues we face on our own systems when the Internet connection is unable to open an address and the browser displays a DNS error notice. This can happen for any one of many reasons, including when you change your hosting service or make name server changes.

Since DNS propagation takes time, a DNS error often results. In many regions, bots may be unable to access your site due to a domain name server failure, and you get a similar error as a result. Remember that you can check the ping status of your site from different countries using a service like Just-Ping.
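If you want to double-check from your own machine that your domain actually resolves, a quick script can help. Here is a minimal sketch in Python; the domain yourdomain.com is only a placeholder for your own:

```python
import socket

# Placeholder domain used only for illustration; replace with your own.
DOMAIN = "yourdomain.com"

try:
    # Ask the resolver for the A records the domain currently returns.
    results = socket.getaddrinfo(DOMAIN, 80, proto=socket.IPPROTO_TCP)
    ips = sorted({entry[4][0] for entry in results})
    print(f"{DOMAIN} resolves to: {', '.join(ips)}")
except socket.gaierror as err:
    # A failure here is roughly what Googlebot reports as a DNS error.
    print(f"DNS lookup for {DOMAIN} failed: {err}")
```

If the lookup succeeds locally but GWT still reports DNS errors, the problem is more likely propagation or a resolver closer to Google’s crawlers, which is exactly where a multi-location tool like Just-Ping is useful.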

A few days ago I moved ShoutMeLoud from Hostgator to Knownhost VPS, and I could see my DNS errors increasing. Ultimately, I received this email from GWT:

“Over the last 24 hours, Googlebot encountered 66 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 1.5%. You can see more details about these errors in Webmaster Tools.”

Since I knew the reason for these errors, I didn’t worry about them, and I currently have zero DNS errors.

DNS errors found in Webmaster Tool can also be due to high latency, and this is usually related to an issue with your hosting service. If you are constantly getting a high rate of DNS errors, you should contact your host’s technical support and ask them to look into it. If they can’t fix the issue, simply move to a better hosting service instead of blaming the host or Google for such errors.
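Before contacting support, you can also get a rough feel for DNS latency by timing a few lookups yourself. This is a small sketch under the same placeholder-domain assumption; note that your operating system may cache the first result, so numbers that look suspiciously fast are worth re-checking from another network:

```python
import socket
import time

DOMAIN = "yourdomain.com"  # placeholder; use your own domain

timings = []
for _ in range(5):
    start = time.perf_counter()
    socket.gethostbyname(DOMAIN)  # one DNS lookup
    timings.append((time.perf_counter() - start) * 1000)

print(f"DNS lookups for {DOMAIN}: " + ", ".join(f"{t:.0f} ms" for t in timings))
# Consistently slow lookups (hundreds of milliseconds) are worth raising
# with your DNS or hosting provider.
```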

Server connectivity errors:

This issue is generally related to your server configuration: Google’s bots can’t crawl your site, or your site times out before the page loads. This usually happens when a page takes too long to load or your server’s resources are exhausted. One easy way to mitigate this is to control the crawl rate of the bots. GWT doesn’t offer you a pretty graph like Bing Webmaster Tools for controlling crawl timing, so you can use the Webmaster Parameter Tool and robots.txt to control the crawling.
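It also helps to know how long your pages actually take to respond before you touch crawl settings. Below is a minimal sketch that fetches one URL with a timeout; both the URL and the 10-second limit are assumptions for illustration, not an official crawler threshold:

```python
import time
import urllib.request

URL = "https://yourdomain.com/"  # placeholder URL
TIMEOUT = 10                     # seconds; an assumed limit, not Google's actual one

start = time.perf_counter()
try:
    with urllib.request.urlopen(URL, timeout=TIMEOUT) as response:
        response.read()
        elapsed = time.perf_counter() - start
        print(f"{URL} answered with HTTP {response.status} in {elapsed:.1f}s")
except Exception as err:
    print(f"{URL} failed or timed out: {err}")
```

Pages that regularly take several seconds here are the ones most likely to time out for a crawler when your server is under load.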

Make sure your server is not underpowered; if it is, move to a hosting service with more resources. If you are using a self-configured VPS or dedicated hosting, make sure your firewall and security settings are not blocking access for search engine bots. You can read more about this on the official page here. If you are not using a caching mechanism of some sort, I would recommend using one of these WordPress cache plugins, which will help you deal with server connectivity errors.
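One quick way to spot an over-aggressive firewall or security plugin is to request a page while identifying as Googlebot and compare the result with a normal browser request. This is only a rough sketch (a real verification should also confirm genuine Googlebot visits via reverse DNS), and the URL is again a placeholder:

```python
import urllib.error
import urllib.request

URL = "https://yourdomain.com/"  # placeholder
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

request = urllib.request.Request(URL, headers={"User-Agent": UA})
try:
    with urllib.request.urlopen(request, timeout=10) as response:
        print(f"Googlebot-style request got HTTP {response.status}")
except urllib.error.HTTPError as err:
    # Getting 403/503 here while a normal browser works fine is a hint that
    # your firewall or security rules are blocking bot traffic.
    print(f"Googlebot-style request was rejected: HTTP {err.code}")
```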

Robots.txt fetch errors:

This is a common error that occurs because of a misconfigured robots.txt file. Bear in mind that robots.txt is used to stop bots from crawling certain areas of your site that you don’t want them to index, usually your wp-admin folder and its parameters, if you are using it. I recommend not blocking the crawling of your tag and category pages with robots.txt, as crawling and indexing are two different things. We have already written a lot about this in the past, and you can refer to the following articles for further information on this concept.
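If you suspect your robots.txt is the culprit, you can test it against the URLs you actually care about using Python’s built-in parser. The sketch below assumes a typical WordPress-style setup and a placeholder domain; adjust the list of URLs to match your own site:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://yourdomain.com/robots.txt")  # placeholder domain
parser.read()  # fetch and parse the live robots.txt

# URLs you expect bots to crawl (or not) on a typical WordPress install.
checks = [
    "https://yourdomain.com/",
    "https://yourdomain.com/sample-post/",
    "https://yourdomain.com/wp-admin/",
]

for url in checks:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")
```

If a page that should rank shows up as BLOCKED here, that is the rule to revisit in your robots.txt file.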

Always remember that some of the errors you will be seeing are temporary. For example, server connectivity errors can happen because of server load, and thus may not be permanent. But if you see errors consistently, or if you’re getting email from GWT regarding crawling issues, you should start working on repairing the issues you are seeing.

Let me know what errors you are seeing when using your Google Webmaster Tool.

What actions are you taking to remove your site errors?

If you find the information in this post useful, please share it with your friends and colleagues on Facebook, Twitter and Google Plus.

Authored By
A blogger, author, and speaker! Harsh Agrawal is recognized as a leader in the digital marketing and FinTech space. Fountainhead of ShoutMeLoud, and a speaker at ASW, Hero Mindmine, Inorbit, IBM, and the India Blockchain Summit. Also an award-winning blogger.

32 thoughts on “How to Fix Google Webmaster Tool Site Crawl Errors”

  1. Jitendra Rathore

    Hi, Harsh

    I am facing a “soft 404 error” for desktop in the URL errors status. Can you please help me remove these errors?
    I have already read Google’s official doc on soft 404 errors, but I am not getting it, so please help me.
    Thanks for sharing your knowledge with us.

  2. sai krishnan

    Hey buddy, it shows me an /m and Mobile
    error. How do I fix it? Please help, buddy.

  3. Hendrik

    Hello Harsh,

    My DNS error was first red and now it’s orange. Google fetch still doesn’t work when I fetch new pages. Do you know when this will be over? I bought my site 3 weeks ago. I deleted all the issues in Webmaster Tools and also the sitemap of the previous owner. I don’t know what to do. Do you think my site is bad, or is this just normal? Thanks for the help!

  4. Peter Tutor

    From time to time GWT shows me crawl and robots.txt errors.
    For a few days it is OK, and then suddenly it is bad again.
    There are no big connectivity issues.
    Fetch as Google also doesn’t work (from time to time).
    Is it a problem with Googlebot, or what else could it be?

  5. Rashid

    The post is easy to understand. My site is also affected by this issue and it shows a DNS error rate of 85%, but my host is GoDaddy. I sent a ticket and they say there is no issue. So what should I do…?

  6. Mircea

    Hi Harish,

    I had a big problem with my hosting; my site was down for 5 days… and now I have 3,168 errors. Any ideas on how to mass-fix those errors?

    1. Harsh Agrawal

      @Mircea
      Those errors are temporary, and you need to ensure your website doesn’t go down again. You can resubmit your Google sitemap and you will notice the number of errors going down every day.

  7. Sameer Kaushik

    Hi Harsh Agrawal, please help me solve my problem.
    First I got robots.txt errors and then a DNS error… Due to these errors I lost a lot of traffic.
    What to do? Please help.

    1. Harsh Agrawal

      @Sameer
      It’s hard to diagnose your issue without proper details. I suggest you post your query on our forum at shoutersland.com

  8. subbareddy

    Hi Harsh ji, help me in fixing my problem.

    Google detected a significant increase in the number of URLs that return a 404 (Page Not Found) error. Investigating these errors and fixing them where appropriate ensures that Google can successfully crawl your site’s pages.

    Recommended action
    Check the Crawl Errors page in Webmaster Tools.
    Check that pages on your site don’t link to non-existent pages. (If another site lists a broken link to your site, Google may list that URL but you may not be able to fix the error.)

    thanks

  9. Abhishek Pathak

    Thanks, Harsh sir, for this lovely post, but I want to know:
    after how much time is the DNS error message marked as green by Googlebot?

  10. priyansh setia

    My website has been facing a server connectivity problem for the last 20 days. I have tried all the above-mentioned steps but it has not been resolved yet… please help.

  11. shiban rah

    Thanks for sharing these useful tips. I use Screaming Frog SEO Spider to crawl my website, find the links which show 404 errors, and find invalid links.

  12. Mukul

    I transferred my hosting service to BigRock 10 days ago, and since then I have been having a server connectivity issue; my posts are also not being indexed. What to do?

  13. Kashyap Shreepathi

    I just shifted my hosting, and I’m getting a DNS error (70%). So what should I do now? Wait, or should I contact my hosting provider?

    1. Harsh Agrawal

      @Kashyap
      If you have done the shift just now, this error is temporary…
      You can use a service like just-ping.com and check whether DNS has been propagated globally or not…

  14. deewaker

    How can I create robots.txt? Please tell me.

  15. Catalin

    robots.txt can be edited offline. It should be located in your /www /html_public folder; if it’s not there, then you can create it manually in Notepad.

  16. Lionel

    My website shows a lot of 404 errors (approx. 44 thousand). Do 404 errors affect SEO?

  17. Reeja Mathews

    I’m also using Webmaster Tools but I’m unable to clear the errors. How can I edit robots.txt?

  18. Nizam

    This is a really useful guide on fixing crawl errors in Google Webmaster Tools. It’s very clear now: when a page takes too much time to load, or the site times out before loading the page, this leads to server connectivity errors, so Google bots cannot crawl the site, but this may not be permanent. Thanks Harsh for the useful info 🙂

  19. akhilendra

    Great tips, thanks for sharing. We often receive these errors but have no clue how to handle them. I think configuring robots.txt is quite tricky: we may block something which should not be blocked and at the same time allow something which should be.

  20. george t mathew

    The post is quite technical in nature; I couldn’t make much of it. By the way, how do you make this robots.txt and the .htaccess file which you often talk about?
