How to Fix Google Webmaster Tool Site Crawl Errors

Google Webmaster Tool (GWT) is the best free SEO tool you can get for your blog. For bloggers of any level, GWT lets you assess the overall health of your site and helps you implement various optimization techniques for your blog. We have covered a few of these in the past, but in case you are just getting started, here are some articles to bring you up to speed:

A few days back, Google added a beautiful chart to their Webmaster Tool showing the site crawl error rate and what causes those errors. This is indeed a very useful feature, and I am going to quickly show you what you can determine from these errors, along with possible solutions for repairing the crawl error warnings using Google Webmaster Tool.

Getting started with site crawl errors in GWT:

Log in to your GWT account, and under your site dashboard, click on Health > Crawl errors. This page will display a graph of crawl errors along with a list of the errors and what caused them. Since my blog is not facing such crawling issues, I'm using the image from the official blog post. The detection of various crawl errors will look something like this:

Site crawl errors

At the time of writing, Google Webmaster Tool shows three types of site errors:

  • DNS errors
  • Server connectivity errors
  • Robots.txt errors

If you see the errors detected in your chart growing like those in the image above, remember that it is your responsibility to fix them so that bots can crawl your site effectively.

Let’s have a look at the various types of possible errors:

Types of Google site errors

On the page showing you the detected errors, you will see a list of links that are affected. Once you have solved the issues, you can select individual links and mark them as fixed.

DNS errors:

These are like the familiar DNS issues we face on our own systems, when the Internet connection is unable to resolve an address and the browser displays a DNS error. This can happen for any one of many reasons, including when you change your hosting service or make name server changes.

Since DNS propagation takes time, such changes often result in DNS errors. A domain name server failure can also leave bots in some countries unable to access your site, producing a similar error. Remember that you can check the ping status of your site from different countries using a service like Just-Ping.
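If you want to confirm that your domain actually resolves before blaming Googlebot, here is a minimal sketch using Python's standard library. The domain yoursite.com is a placeholder; substitute your own:

```python
import socket

# "yoursite.com" is a placeholder; use your own domain.
DOMAIN = "yoursite.com"

try:
    # Ask the local resolver for the addresses bots would also need.
    infos = socket.getaddrinfo(DOMAIN, 80, proto=socket.IPPROTO_TCP)
    addresses = sorted({info[4][0] for info in infos})
    print(f"{DOMAIN} resolves to: {', '.join(addresses)}")
except socket.gaierror as err:
    # A failure here mirrors the DNS errors Googlebot reports in GWT.
    print(f"DNS lookup failed for {DOMAIN}: {err}")
```

Keep in mind this only tests resolution from your own network; a service like Just-Ping is still the way to check from other countries.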

A few days ago I moved ShoutMeLoud from Hostgator to a Knownhost VPS, and I could see my DNS errors increasing. Ultimately I received this email from GWT:

“Over the last 24 hours, Googlebot encountered 66 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 1.5%. You can see more details about these errors in Webmaster Tools.”

Since I knew the reason for these errors, I didn’t bother with it, and I currently have zero DNS errors.

DNS errors found in Webmaster Tool can also be due to high latency, which is usually an issue with your hosting service. If you are consistently getting a high rate of DNS errors, contact your host's technical support and ask them to look into it. If they can't fix the issue, simply move to a better hosting service instead of blaming Google for such errors.

Server connectivity errors:

This issue is generally related to your server configuration: Google's bots can't reach your site, or your site times out before the page loads. This usually happens when a page requires too much time to load, or your server's resources are exhausted. One easy way to mitigate this is to control how bots crawl your site. GWT doesn't offer a pretty graph like Bing Webmaster Tools for controlling crawl timing, but you can use the URL Parameters tool and robots.txt to control the crawling.
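To see whether a page is slow enough to risk a timeout, you can time a fetch yourself. A minimal sketch, assuming a hypothetical URL and using a 10-second timeout as a rough stand-in for a bot's patience:

```python
import time
import urllib.request

# Placeholder URL; point it at a page Googlebot reports as timing out.
URL = "https://yoursite.com/"

start = time.monotonic()
try:
    # A short timeout approximates how long a crawler will wait.
    with urllib.request.urlopen(URL, timeout=10) as response:
        response.read()
    elapsed = time.monotonic() - start
    print(f"{URL} answered in {elapsed:.2f}s")
except Exception as err:
    # This is roughly what a server connectivity error looks like to a bot.
    print(f"Fetch failed: {err}")
```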

Make sure your server is not underpowered. If it is, move to a hosting service with more resources. If you are using a self-configured VPS or dedicated hosting, make sure your firewall and security settings are not blocking access for search engine bots. You can read more about this on the official page here. If you are not using a caching mechanism of some sort, I would recommend using any of these WordPress cache plugins, which will help you deal with server connectivity errors.
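As mentioned above, a firewall can silently block bots. One quick way to spot user-agent-based blocking is to request your homepage while identifying as Googlebot. Note that this sketch only tests user-agent rules (a firewall blocking by IP range won't show up here), and yoursite.com is again a placeholder:

```python
import urllib.error
import urllib.request

# Placeholder URL; replace with your own homepage.
URL = "https://yoursite.com/"
# Googlebot's classic user-agent string.
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

request = urllib.request.Request(URL, headers={"User-Agent": UA})
try:
    with urllib.request.urlopen(request, timeout=10) as response:
        print(f"HTTP {response.status}: server answers Googlebot's user agent")
except urllib.error.HTTPError as err:
    # A 403 here often means a firewall or security rule blocks bot traffic.
    print(f"Blocked with HTTP {err.code}; check firewall/security rules")
```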

Robots.txt fetch errors:

This is a common error caused by a misconfigured robots.txt file. Bear in mind that robots.txt is used to stop bots from crawling areas of your site which you don't want them to index, usually your wp-admin folder. I recommend not blocking the crawling of your tag and category pages with robots.txt, as crawling and indexing are two different things. We have already written a lot about this in the past, and you can refer to the following articles for further information on this concept.
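To check that Googlebot can actually fetch and obey your robots.txt, Python's standard library ships a parser you can point at your site. A minimal sketch, with yoursite.com and the sample paths as placeholders:

```python
import urllib.robotparser

# Placeholder domain; replace with your own site.
BASE = "https://yoursite.com"

parser = urllib.robotparser.RobotFileParser(BASE + "/robots.txt")
try:
    # Fetch and parse the file the way a well-behaved crawler would.
    parser.read()
except Exception as err:
    # This failure is what GWT reports as a robots.txt fetch error.
    print(f"Could not fetch robots.txt: {err}")
else:
    # Spot-check that you block only what you mean to block.
    for path in ("/wp-admin/", "/category/seo/", "/"):
        allowed = parser.can_fetch("Googlebot", BASE + path)
        print(f"Googlebot may fetch {path}: {allowed}")
```

If the fetch fails here but the file opens fine in your browser, suspect a firewall or security rule rather than the file itself.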

Always remember that some of the errors you see are temporary. For example, server connectivity errors can happen because of server load and thus may not be permanent. But if you see errors consistently, or if you're getting emails from GWT regarding crawling issues, you should start working on repairing them.

Let me know what errors you are seeing when using your Google Webmaster Tool.

What actions are you taking to remove your site errors?

If you find the information in this post useful, please share it with your friends and colleagues on Facebook, Twitter and Google Plus.

Authored By
A blogger, author, and speaker! Harsh Agrawal is recognized as a leader in the digital marketing and FinTech space. Fountainhead of ShoutMeLoud, and a speaker at ASW, Hero Mindmine, Inorbit, IBM, and the India Blockchain Summit. Also an award-winning blogger.

32 thoughts on “How to Fix Google Webmaster Tool Site Crawl Errors”

  1. Om Sehat

    Great sharing 😀

    From this article I found a solution to resolve the errors in my Webmaster Tools.

  2. Mohammad Rizwan

    Hi Harsh, I found a DNS error in Webmaster Tools (yellow color). Please tell me how to fix this. Thanks!

  3. Ajay Borkar

    Hi Harsh,

    I am getting the below error:
    “Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.”

    Request you to please advise.

    1. Harsh Agrawal

      Hey Ajay,

      Check that your hosting provider is not blocking Googlebot. If you have a firewall, make sure that its configuration is not blocking Google.

  4. shibsankar

    Hi Harsh, I have a problem with my Webmaster Tools: it shows a “Sneaky mobile redirects detected” message. What is the actual problem, and how do I solve it?

  5. balu alluri

    Hello! Harsh.
    I migrated my site from Blogger to WordPress. After that, Search Console showed a number of errors like Not Found, soft errors, smartphone errors, and feature phone errors. Hundreds of posts are facing these errors, and I want to know how to fix them.
    Mainly I want to fix “Not Found” errors.
    After migrating from Blogger to WordPress:
    When I type my site into Google search on mobile, it shows the ?m=1 link. It is not updated. What can I do?

  6. Sunita Wadekar

    Hello Harsh, hope you are doing good! I am working as an SEO analyst and doing SEO for one of my websites. A few days ago, when I was checking my keywords in Google search results, I found my website showing in the results, but below my site’s URL the message “This site may be hacked” was displayed. Recently that message was removed by the developer, but my keywords have been badly affected: only one keyword is displaying, on the 3rd page, and the rest of the keywords are not in the top 100. Please let me know how to deal with this issue and how to regain my rankings. Another issue: when I try to open my website from the Google results in the Chrome browser, it opens for a moment and then closes automatically after a second. I want to know why this is happening. Please suggest a way so that I can remove these issues from my website.

    Thanks & Regards
    Sunita Wadekar

  7. Anak Banjarwangi

    Hi Agrawal, this article was useful to me. I have already moved to new, faster hosting. But I found hundreds of 404 Not Found links in my crawl errors in Webmaster. How can I fix them automatically?
