Site depth: how to spot problems and correct them?

When a website has a large number of pages, it’s essential to monitor its depth and fix the associated issues.

Written by Zyad Soummari
Updated over a year ago

A site’s depth is the maximum number of clicks required to reach any of its pages from the homepage.

We recommend a maximum depth of 4. Beyond that, there can be several consequences for the website’s health:

  • Crawlability suffers and the robot visits less and less often. Googlebot allocates crawl resources to a website according to several criteria, such as content quality and server performance, and click depth is an important one.

  • Loss of positions or low visibility: in addition to on-page and technical optimization criteria, a page’s authority plays an important part in its ranking (its weight depends on the topic). This authority can be acquired either directly via external links (backlinks) or via the distribution of internal links (from the homepage and parent categories). If the page is very deep, the authority it receives is heavily diluted and therefore not very effective: its position stagnates, and if the page gets even deeper into the site over time, its position drops.

So how do you measure page depth and analyse the results?

We recommend using a crawler: most crawlers list each page along with its depth. This is what we do when we carry out an audit.
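
If you want to see what a crawler does under the hood, here is a minimal sketch of the idea: a breadth-first crawl that starts from the homepage and records, for each internal URL, the minimum number of clicks needed to reach it. It assumes the `requests` and `beautifulsoup4` packages and a placeholder start URL; a real crawler also handles robots.txt, nofollow, canonical tags and redirects.

```python
# Minimal sketch of a breadth-first crawl that records click depth.
# Assumes the `requests` and `beautifulsoup4` packages; the start URL is a placeholder.
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

def crawl_depths(start_url, max_pages=500):
    """Return a {url: click_depth} map; the homepage is at depth 0."""
    host = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target, _ = urldefrag(urljoin(url, link["href"]))
            # Stay on the same host; breadth-first order guarantees the first
            # depth recorded for a URL is the shortest click path to it.
            if urlparse(target).netloc == host and target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

depths = crawl_depths("https://www.example.com/")
print("Maximum click depth:", max(depths.values()))
```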

Maximum click depth

First, check out the maximum click depth:

  • If it’s less than or equal to 4: all good, you have no depth problem.

  • If it’s more than 4: go to the next point.

The percentage of URLs affected by a depth >4

Do deep pages represent 20%, 40% or 70% of the pages on your website? The more pages are affected, the higher the impact, and therefore the higher the priority of addressing this point.
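
To put a number on this, you can compute the share directly from your crawler’s export. The small sketch below assumes a CSV export with "url" and "depth" columns; the exact column names vary from one crawler to another.

```python
# Sketch: share of URLs deeper than a threshold, from a hypothetical CSV export
# with "url" and "depth" columns (adapt the column names to your crawler).
import csv

def share_of_deep_urls(csv_path, threshold=4):
    with open(csv_path, newline="") as f:
        depths = [int(row["depth"]) for row in csv.DictReader(f)]
    deep = sum(1 for d in depths if d > threshold)
    return 100 * deep / len(depths)

print(f"{share_of_deep_urls('crawl_export.csv'):.1f}% of URLs are deeper than 4")
```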

Now go through the list of affected pages and check which case applies to you:

  • An issue with pagination

If your website contains many products per category, pagination can generate very significant depth. Imagine a category whose products are spread across 30 paginated pages: with incremental pagination such as "1, 2, 3, 4… 29, 30" or "previous, next", the website will have a minimum depth of 31.

==> Locate those pages in the list, then rework the pagination setup.

These can be both paginated listing pages and product pages: since product pages sit right behind the listings, they are bound to be even deeper.

The same reasoning applies to blogs: say we have a WordPress blog with "previous / next" incremental pagination and we publish about 5 posts per week ==> the depth keeps increasing. An article that performed very well in June 2017 will perform far worse in 2020: its position falls, because the links it receives become deeper and more diluted.

==> Locate those pages in the list, then rework the pagination setup (see the sketch after this list for how the pagination scheme drives depth).

  • Another cause

If it is not a pagination problem, then it is the website’s structure itself that must be reviewed: significant depth indicates many levels. In that case, ask whether the intermediate categories are useful for SEO; if not, replace them with dynamic filters, for example, to reduce the number of levels. Significant depth is also an ergonomics problem for users browsing the site.
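
As promised in the pagination point above, here is an illustrative sketch of how the pagination scheme drives depth: it counts the clicks needed to reach page p of a paginated series from its first page. With "previous / next" only, depth grows linearly with the number of pages, whereas linking a block of page numbers at a time divides it by the block size. The 10-link block and the page counts are purely illustrative assumptions.

```python
# Illustrative sketch (not tied to any CMS): clicks needed to reach page p
# of a paginated series from page 1, under two pagination schemes.
import math

def clicks_prev_next(p):
    """Only "previous / next" links: page p is p - 1 clicks away from page 1."""
    return p - 1

def clicks_numbered_block(p, block=10):
    """Assume each page links to the next `block` page numbers:
    depth grows `block` times more slowly."""
    return math.ceil((p - 1) / block)

for pages in (30, 100):
    print(f"{pages} pages: prev/next depth = {clicks_prev_next(pages)}, "
          f"10-number block depth = {clicks_numbered_block(pages)}")
```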

And there you go! Now you know all the best practices for website depth!
