The robots.txt file: audit and problem solving

Learn the steps to find and fix problems with your robots.txt

Written by Celina
Updated over a week ago

Do you want to analyze a website’s robots.txt? Below, we explain what to look for and which corrections to make.

If you are still not sure what the robots.txt file is for and how to implement it, please refer to this article.

Your website does not have robots.txt

It is quite possible that a website does not have one. To check, open a browser and append "/robots.txt" to your domain URL. Repeat the check for each of your subdomains.
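Since the file always lives at the same path, the URL to check can be built automatically. A minimal sketch, assuming HTTPS and using only the standard library (`robots_url` is a hypothetical helper name, not part of any tool mentioned here):

```python
from urllib.parse import urlunsplit

def robots_url(host: str) -> str:
    """Build the robots.txt URL for a domain or subdomain (hypothetical helper)."""
    # robots.txt must sit at the root of the host, so the path is fixed.
    return urlunsplit(("https", host, "/robots.txt", "", ""))

print(robots_url("example.com"))       # https://example.com/robots.txt
print(robots_url("blog.example.com"))  # https://blog.example.com/robots.txt
```

Opening each of these URLs in a browser (or fetching them with any HTTP client) tells you whether the file exists for that host.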

If you don't have robots.txt:

  • Do you need one? Check whether you have any low-value pages that you don't want indexed. For example: your shopping cart, your internal search engine's results pages, etc.

  • If you do need one, create the file according to the guidelines in the article linked above.
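For the examples given above, the file could look like the following sketch. The paths (`/cart/`, `/search`) are placeholders, and whether these rules fit your site depends on your own URL structure:

```
User-agent: *
Disallow: /cart/
Disallow: /search
```

Each `Disallow` line blocks every URL whose path starts with that prefix, for all crawlers (`User-agent: *`).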

Your website has a robots.txt

Open the file and review which pages are blocked:

  • If pages are blocked that should not be: remove them from the file.

  • If pages that must be blocked are missing: add them.

  • If the blocked pages are exactly the ones you intended: that's fine, there is nothing to do.
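One way to double-check your review is to test specific URLs against the rules programmatically. A minimal sketch using Python's standard `urllib.robotparser`; the rules and URLs shown are the hypothetical example from this article, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; substitute your own file's rules.
rules = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) answers: may this crawler fetch this URL?
print(parser.can_fetch("*", "https://example.com/cart/checkout"))  # False (blocked)
print(parser.can_fetch("*", "https://example.com/blog/post"))      # True (allowed)
```

If a URL you want indexed comes back `False`, or one you want hidden comes back `True`, the corresponding rule needs fixing.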

Now you know everything about analyzing your robots.txt!

Check out similar articles:

  • The Google Search Console coverage report
