Google Search Console: identify and analyze wrong URLs

How do you understand and analyze URLs flagged with errors in the Google Search Console coverage report?

Written by Zyad Soummari
Updated over a week ago

In Google Search Console (GSC), the coverage report, accessible in the left-hand column under "Coverage", gives an overview of a website’s indexing status. It is a goldmine of information about your content’s crawlability and indexing.

If you do not have a GSC account yet, you will first need to create one and verify your property.

The coverage report contains four sections: Error, Valid with warnings, Valid, and Excluded.

We recommend starting with the "Valid" section, shown in green.

Below, we’ll look at the third section to check: "Error", shown in red.

This section lists the URLs that Google has not indexed because they contain errors. Unlike the "Excluded" section, these are URLs you chose to submit to Google via a sitemap, which is why GSC flags them in this error section.

Click on the red section "Error":

Then, you can click on each error to see the list of affected URLs.

Technical errors

Let us first list the "technical" errors:

  • Server Error (5xx): the server did not respond to an apparently valid request.

  • Redirect error: a 301/302 redirect is broken (for example, a redirect chain or loop).

  • Submitted URL seems to be a soft 404: you submitted this page for indexing, but the server returned what appears to be a soft 404 (a "not found" page served with a 200 status code).

  • Submitted URL returns unauthorized request (401): you submitted this page for indexing, but Google received a 401 (unauthorized) response.

  • Submitted URL not found (404): you submitted a URL for indexing, but it does not exist.

  • Submitted URL has a crawl error: you submitted this page for indexing, and Google detected an unspecified crawl error that does not match any of the other reasons.

For all of these errors: fix the error if the page should be indexed; otherwise, remove the URL from the sitemap and from your internal linking.
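As a first diagnostic, the HTTP status behind most of these errors can be checked outside of GSC. The sketch below is a hypothetical helper (not part of GSC or any official API): it issues a HEAD request without following redirects and maps the raw status code to the coverage error it usually triggers.

```python
# Hypothetical diagnostic sketch: fetch a URL's raw HTTP status (without
# following redirects) and map it to the GSC coverage error it usually triggers.
import http.client
from urllib.parse import urlsplit


def classify_status(status: int) -> str:
    """Map an HTTP status code to the likely GSC coverage verdict."""
    if 500 <= status <= 599:
        return "Server error (5xx)"
    if status == 401:
        return "Unauthorized request (401)"
    if status == 404:
        return "Not found (404)"
    if status in (301, 302, 307, 308):
        return "Redirect - check that the target resolves"
    return "OK" if status == 200 else f"Review status {status}"


def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Issue a HEAD request and return the raw status code (redirects not followed)."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status
    finally:
        conn.close()
```

For example, `classify_status(fetch_status("https://example.com/page"))` (an illustrative URL) lets you batch-check every URL from your sitemap before resubmitting it. Note that a soft 404 cannot be detected this way, since the server answers 200; it has to be judged from the page content.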

Indexing errors

  • Submitted URL blocked by robots.txt: this page is blocked by the robots.txt file while also being submitted via the XML sitemap. Remove it from robots.txt or from the XML sitemap, depending on whether you want it indexed.

  • Submitted URL marked "noindex": you submitted this page for indexing, but it contains a "noindex" directive in a meta tag or HTTP header. If you want this page indexed, remove the tag or the HTTP header; otherwise, remove the URL from the sitemap.
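Both of these conflicts can be reproduced locally before resubmitting a sitemap. The sketch below uses only Python's standard library; the helper names, the robots.txt content, and the HTML are illustrative assumptions, not anything GSC exposes.

```python
# Hypothetical sketch: detect the two indexing conflicts locally.
from html.parser import HTMLParser
import urllib.robotparser


def is_blocked(robots_txt: str, url: str, user_agent: str = "Googlebot") -> bool:
    """Return True if the given robots.txt content forbids user_agent from crawling url."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(user_agent, url)


class NoindexFinder(HTMLParser):
    """Flags a <meta name="robots" content="...noindex..."> tag in an HTML page."""

    def __init__(self) -> None:
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = {k.lower(): (v or "") for k, v in attrs}
            if d.get("name", "").lower() == "robots" and "noindex" in d.get("content", "").lower():
                self.noindex = True


def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with a noindex directive."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex
```

For instance, `is_blocked("User-agent: *\nDisallow: /private/", "https://example.com/private/page")` returns True. Keep in mind that a "noindex" can also be sent as an `X-Robots-Tag` HTTP header, which this HTML-only check will not see.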


