Google Search Console: identify and analyze “valid with warnings” URLs

How to understand and analyze “valid with warnings” URLs within the Google Search Console coverage report?

Written by Celina
Updated over a year ago

In Google Search Console (GSC), the coverage report, accessible in the left-hand menu under "Coverage", gives an overview of the website’s indexing status. It is a goldmine of information about how well a website is crawled and indexed.

If you do not have a GSC account yet, you will first need to create one and validate your property.

The coverage report is as follows:

It contains 4 sections: Error, Valid with warnings, Valid and Excluded.

We recommend starting with the "Valid" section, shown in green.

Next, consult the "Valid with warnings" section, shown in orange.

This section is now associated with a single problem: pages indexed despite being blocked by the robots.txt file.

Explanation of the warning

The robots.txt file is a crawl-blocking tool, not a deindexing tool. Pages blocked in robots.txt can still appear in Google’s index if a third-party website links to them.
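To see which URLs your robots.txt blocks from crawling, you can test rules locally with Python’s standard-library `urllib.robotparser`. The rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block everything under /private/
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is blocked from crawling this URL...
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
# ...but blocking only stops crawling: if another site links to the URL,
# Google may still index it, which is exactly what this warning reports.
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

This only checks crawl permissions; it cannot tell you whether Google has actually indexed a blocked URL, which is what the coverage report shows.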

Interpretation and correction

Click on the "valid with warnings" section:

Then, click on the "Details" subsection to display detailed lists:

  • If these pages should be indexed: remove the rules blocking them from robots.txt as soon as possible so they can be crawled and indexed.

  • If not: remove these pages from robots.txt, deindex them properly (for example with a noindex directive), and only then put the blocking rules back in robots.txt.

The method is detailed here: deindex low added value pages.
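The second case above can be sketched as follows (the path and page are hypothetical examples): while the robots.txt block is lifted, each page to deindex serves a noindex robots meta tag; once Google has dropped the pages, the blocking rule can be restored.

```html
<!-- Hypothetical page under /low-value/ that should be deindexed.
     Step 1: the "Disallow: /low-value/" rule is temporarily removed
     from robots.txt so Googlebot can crawl the page again.
     Step 2: the page serves a noindex directive: -->
<head>
  <meta name="robots" content="noindex">
</head>
<!-- Step 3: once the page has dropped out of the index,
     the Disallow rule can be put back in robots.txt. -->
```

Note that the noindex directive only works while the page is crawlable: if robots.txt still blocks it, Googlebot never sees the tag.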

The coverage report’s other sections:

Check out similar articles:
