URLs: spot errors and correct them

How do you write SEO-friendly URLs?

Written by Zyad Soummari
Updated over a week ago

URLs give search engines information about a page's content, its context and its target audience. They must be customized so that both the website's content and its target are perfectly understood. Each URL should therefore meet the following criteria (a quick automated check is sketched right after the list):

  • Contain the main keyword

  • Stay under 100 characters

  • Be clear and readable

  • Use hyphens "-" to separate words

  • Avoid dynamic URLs, "_" underscores, special characters, symbols, accents, capital letters, punctuation (apostrophes, etc.) and hierarchies that are too deep

  • Do not contain a "?" (unless it points to the right content through an "=" key/value parameter)
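
To make these criteria concrete, here is a minimal Python sketch. The 100-character limit, the allowed characters and the depth threshold are taken from the list above, and the example URLs are hypothetical; adapt them to your own guidelines.

```python
import re
from urllib.parse import urlparse

# Thresholds taken from the criteria above; adjust them to your own rules.
MAX_LENGTH = 100
# Lowercase letters, digits, hyphens, dots and slashes only.
ALLOWED_PATH = re.compile(r"^[a-z0-9/.\-]*$")

def check_url(url):
    """Return a list of human-readable issues found in the URL."""
    issues = []
    parsed = urlparse(url)
    if len(url) > MAX_LENGTH:
        issues.append(f"too long ({len(url)} characters, limit {MAX_LENGTH})")
    if parsed.query and "=" not in parsed.query:
        issues.append("contains '?' without a key=value parameter")
    if "_" in parsed.path:
        issues.append("contains underscores")
    if not ALLOWED_PATH.match(parsed.path):
        issues.append("contains capital letters, accents or special characters")
    if parsed.path.count("/") > 4:
        issues.append("hierarchy may be too deep")
    return issues

# Hypothetical example URLs.
for url in ["https://mywebsite.com/fr/my-main-keyword",
            "https://mywebsite.com/FR/Categorie_ete?page"]:
    print(url, "->", check_url(url) or "OK")
```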

Check and correct URL errors

A crawler can help you spot URL issues more quickly. In any case, as long as you have an inventory of your website's pages (from a crawl or your sitemap), the 3 verification points below become possible.
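
If you don't have an inventory yet, the sitemap is often the quickest source. Below is a minimal Python sketch, assuming your site exposes a standard sitemap.xml at the root (the mywebsite.com URL is a placeholder).

```python
import urllib.request
import xml.etree.ElementTree as ET

# Assumed sitemap location; adjust to your site.
SITEMAP_URL = "https://mywebsite.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def load_inventory(sitemap_url):
    """Return every <loc> URL declared in a standard sitemap."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    return [loc.text.strip() for loc in tree.getroot().findall("sm:url/sm:loc", NS)]

urls = load_inventory(SITEMAP_URL)
print(f"{len(urls)} URLs found in the sitemap")
```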

URLs containing errors are also listed in the technical audits carried out by our team.

CASE #1: size - URLs that are too long

Make sure that your URLs are 100 characters maximum.

If they exceed this limit:

  1. Check the page's current traffic, as well as its value for your business, then:

  2. If it's a very important page and its length remains "reasonable" (roughly 110-120 characters) => it's better to keep the URL as it is, so as not to penalize its traffic and conversions.

  3. If it's a page with a very long URL (more than 120 characters) that currently brings little traffic, or traffic you're ready to "sacrifice" momentarily => the best option is to modify the URL and set up a "301 redirect" from the old URL to the new one (a verification sketch follows this list). Most CMSs (like WordPress, for example) do this automatically. Be careful: in that case, make sure to modify the links pointing to the old URL so that they now point to the new one without going through the redirect. Likewise, for netlinking, if backlinks pointed to this page, the ideal is to contact those websites so they update the link's URL.
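
Here is a hedged Python sketch, using the requests library, to verify that each old URL answers with a 301 and points to the expected new URL. The OLD_TO_NEW mapping is a hypothetical example to replace with your own pairs.

```python
import requests

# Hypothetical old -> new URL mapping; replace with your own pairs.
OLD_TO_NEW = {
    "https://mywebsite.com/a-very-long-old-url-example-that-was-shortened":
        "https://mywebsite.com/short-url",
}

for old_url, new_url in OLD_TO_NEW.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code == 301 and location.rstrip("/") == new_url.rstrip("/"):
        print(f"OK   {old_url} -> {new_url}")
    else:
        print(f"FAIL {old_url}: status {response.status_code}, Location {location!r}")
```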

Why is there a "sacrifice" of traffic in the latter case?

When you set up a redirect, it takes Google a little time to understand the change and take the new page into account. There is therefore an unavoidable dip in traffic linked to a loss of positions, which generally lasts about a week, although it all depends on how often and how well your website is usually crawled.

In addition, if you are not careful to update the internal and external links (backlinks) to the new URL, the modified page (new URL) may never return to its previous position.

CASE #2: special characters

Make sure your URLs contain none of the following: "_" underscores, special characters, symbols, accents, capital letters or punctuation.

The only authorized separator character is the hyphen "-".

Analyze pages with special characters in the same way as above for URLs that are too long.
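
As a starting point, the small Python sketch below scans an inventory of URLs and flags paths containing characters outside the allowed set. The character rules and the example URLs are assumptions based on this article.

```python
import re
from urllib.parse import urlparse

# Anything outside lowercase letters, digits, hyphens, dots and slashes.
DISALLOWED = re.compile(r"[^a-z0-9/.\-]")

def flag_special_characters(urls):
    """Map each problematic URL to the characters that triggered the flag."""
    report = {}
    for url in urls:
        bad = sorted(set(DISALLOWED.findall(urlparse(url).path)))
        if bad:
            report[url] = "".join(bad)
    return report

# Hypothetical example URLs.
print(flag_special_characters([
    "https://mywebsite.com/fr/robe_ete",   # underscore
    "https://mywebsite.com/fr/Robe-Ete",   # capital letters
    "https://mywebsite.com/fr/robe-ete",   # clean
]))
```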

CASE #3: duplicate content

This is DUST (Different URL Same Text): two different URLs with 100% identical content.

Some crawlers can spot them, and we also do so in the SmartKeyword audit.

For example, the home page can exist twice under 2 different URLs: mywebsite.com/fr and mywebsite.com/fr/dashboard.

The problems that this causes:

  1. Duplicate content in itself: it complicates the website’s crawl by the robots, for no reason.

  2. Impact on internal linking: duplicated URLs cause confusion. If, while writing an article, an editor links to the "wrong" URL rather than the right one, the right URL ends up with fewer incoming internal links than it should actually receive.

Note that even adding 301 redirects from the "wrong" URLs to the "right" ones will not solve the problem. The 301s degrade the quality of internal links and the transmission of SEO juice (PageRank), and therefore devalue the internal linking.

The most frequent example is when a page exists under 2 URLs: one ending with "/" and the other not.
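
For this trailing-slash case, a minimal Python sketch, assuming you already have the list of URLs, can group the variants so you can pick a single version to keep.

```python
from collections import defaultdict

def group_trailing_slash_duplicates(urls):
    """Group URLs that only differ by a trailing slash."""
    groups = defaultdict(list)
    for url in urls:
        groups[url.rstrip("/")].append(url)
    # Keep only the entries that actually have more than one variant.
    return {key: variants for key, variants in groups.items() if len(variants) > 1}

print(group_trailing_slash_duplicates([
    "https://mywebsite.com/fr",
    "https://mywebsite.com/fr/",
    "https://mywebsite.com/fr/blog",
]))
```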

How to correct these duplicates:

  1. Check that the canonical tags point to a single URL, which will be considered the "correct" URL (a check is sketched after this list). NB: on some CMSs, this is automatic.

  2. For duplicate URLs, you have to set up 301 redirects from the "wrong" URLs to the "right" one.

  3. Review your internal linking to make sure you only link to the "right" pages, so as not to add redirects inside your internal link structure.
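
For step 1, here is a hedged Python sketch, using requests and a deliberately simplistic regex, that reads each page's canonical tag so you can confirm all variants point to the same "right" URL. The duplicate URLs are the placeholders from the example above.

```python
import re
import requests

# Deliberately simple extraction of <link rel="canonical" href="...">;
# a real HTML parser would be more robust.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def canonical_of(url):
    """Return the canonical URL declared by the page, or None if absent."""
    html = requests.get(url, timeout=10).text
    match = CANONICAL_RE.search(html)
    return match.group(1) if match else None

# Placeholder duplicate URLs from the example above.
duplicates = ["https://mywebsite.com/fr", "https://mywebsite.com/fr/dashboard"]
print({url: canonical_of(url) for url in duplicates})
# Every value should be the same single "right" URL.
```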

Note that these duplicate cases can also be identified using Title tags for example (duplicate tags).

Now URL analysis has no more secrets for you! 😊
