
Posted by Sanket Patel

March 20, 2012

SEO

Crawl Errors Better For Your Website or Not?

Recently, I noticed some changes to the Crawl Errors feature in Google Webmaster Tools.

Crawl Errors is one of the best-known features in Webmaster Tools, and it has recently received some useful enhancements. Many new types of errors are now detected and reported, and to make sense of the new data they are divided into two groups: site errors and URL errors.

Site Errors:



Site errors fall into three sub-categories: DNS, server connectivity, and robots.txt fetch. DNS errors include DNS lookup timeouts, domain name not found, and general DNS failures, while server connectivity errors cover network unreachable, no response, connection refused, and connection reset. The last one, robots.txt fetch, covers errors specific to the robots.txt file. If Googlebot requests robots.txt and gets a server error, it has no way to know whether a robots.txt file exists and what it allows, so it postpones the crawl until it can fetch the file without an error.
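To illustrate the robots.txt fetch case, here is a minimal sketch (my own example, not anything Google provides) that checks how a site's robots.txt responds. The domain is a placeholder; a clean 200 or 404 lets a crawler proceed, while a 5xx server error or a connection problem leaves it unable to tell whether crawl rules exist.

```python
# Minimal sketch: check the status of a site's robots.txt.
# 200 = file fetched, 404 = no file (both fine for crawling);
# 5xx or a connection error means the crawler cannot know the rules.
import urllib.request
import urllib.error

def check_robots_txt(domain):
    url = f"https://{domain}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.getcode()          # e.g. 200: robots.txt fetched
    except urllib.error.HTTPError as e:
        return e.code                      # e.g. 404: absent, 503: fetch error
    except urllib.error.URLError as e:
        return f"connection problem: {e.reason}"  # DNS / connectivity issues

if __name__ == "__main__":
    # Replace with your own domain.
    print(check_robots_txt("www.example.com"))
```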

URL errors:


Various URL errors are:

Server error- These URLs return 5xx errors, such as a 503 during server maintenance.

Not found- These URLs return a 404 or 410 response code.

Access denied- These URLs return a 401, 403, or 407 response code. A URL that prompts for a login is often intentional rather than a real error; to improve crawl efficiency, you can block such URLs from being crawled.

Soft 404- These pages look like error pages but don't return a 404 response code; typically they return a 200 or a 301/302 instead. Such pages can end up in search results, which makes for a poor searcher experience. They also hurt crawl efficiency, because Googlebot can end up crawling these pages instead of the valid pages you want indexed (see the sketch after this list for one way to spot candidates).

Not followed- These are URLs that redirect (returning a 301 or 302) but that Googlebot had trouble following, for example because of a redirect loop. Webmaster Tools doesn't give you details of the redirect error, only that the URL returned a 301 or 302.

Other- This includes all other remaining errors.
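As a rough illustration of these buckets, here is a small sketch (assumed example URLs, not an official tool) that requests a URL without following redirects and maps its HTTP status code onto the categories above. A 200 response still needs a manual look at the page body to rule out a soft 404, since by definition those pages report success while showing an error.

```python
# Minimal sketch: classify URLs into the crawl-error buckets described above
# by their HTTP status codes, without following redirects.
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes 3xx responses surface as HTTPError instead of
    # being silently followed, so we can report the redirect itself.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

def classify(url):
    """Map a URL's status code onto the crawl-error categories above."""
    try:
        code = opener.open(url, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        code = e.code
    except urllib.error.URLError as e:
        return f"connectivity problem: {e.reason}"

    if code in (301, 302):
        return f"{code}: redirect (check for loops / 'not followed')"
    if code in (401, 403, 407):
        return f"{code}: access denied"
    if code in (404, 410):
        return f"{code}: not found"
    if code >= 500:
        return f"{code}: server error"
    if code == 200:
        return "200: OK (inspect the page body to rule out a soft 404)"
    return f"{code}: other"

if __name__ == "__main__":
    # Placeholder URLs; replace with pages from your own site.
    for u in ["https://www.example.com/", "https://www.example.com/missing-page"]:
        print(u, "->", classify(u))
```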
