The 5 Most Common Technical SEO Issues & How to Fix Them: Part 2

In case you missed Part 1 of The 5 Most Common Technical SEO Issues & How to Fix Them, here is the link.

4. Multiple versions of your site

This is the most common, yet most easily avoidable, technical SEO mistake. Having multiple versions of your site live at once confuses search engines, which ultimately leads to indexation problems. It doesn't just trip up search engines, either; it can confuse your users as well.

For example, below are four different URLs that can all serve the same home page (using example.com as a placeholder):

http://example.com
http://www.example.com
https://example.com
https://www.example.com

In this particular example, search engines may treat all four of these URLs as four different pages with identical content, which ultimately creates a duplicate content issue.

To resolve this issue, the first thing you should do is select a preferred domain in Google Search Console. Then check each of the listed domain variations and verify that they automatically redirect to your master domain.

If they are not redirecting automatically, you can take the following actions:

1. Use 301 redirects (NOT 302 redirects) on all the duplicate website variations.

2. If your CMS is not redirecting upper-case URLs to lowercase URLs, add server-level rewrite directives to handle the redirect.

3. You can also specify the preferred version of your website using a canonical link tag or a noindex, follow robots tag.

4. Make sure that all the internal links are using your preferred domain.
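As a rough sketch, steps 1 and 2 could be handled with Apache rewrite rules (assuming an Apache server with mod_rewrite enabled, and https://www.example.com as a hypothetical preferred domain — substitute your own):

```apache
# In the main server config (httpd.conf) -- RewriteMap cannot be
# declared inside .htaccess itself:
#   RewriteMap lc int:tolower

# In .htaccess:
RewriteEngine On

# Step 1: force HTTPS and the www subdomain with a single 301
# (permanent) redirect -- never a 302.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

# Step 2: redirect any URL containing uppercase letters to its
# lowercase form, using the "lc" map defined in the server config.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^/?(.*)$ /${lc:$1} [L,R=301]
```

For step 3, a canonical tag such as `<link rel="canonical" href="https://www.example.com/">` in each page's HTML head reinforces the same preferred version.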

5. URL parameters

E-commerce websites and publisher sites typically have a lot of URLs with suffixes that begin with “/?” and contain “=”. These suffixes are used to provide similar page content in different formats.


In simple terms, these suffixes allow users to filter or sort content within the site (from highest to lowest price, by average customer rating, etc.). But if not managed correctly, the same URL parameters that provide convenience to users can damage the site's SEO.
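For instance, a hypothetical product-listing page might generate parameter variations like these (the domain and parameter names are illustrative):

```text
https://www.example.com/shoes/
https://www.example.com/shoes/?sort=price_desc
https://www.example.com/shoes/?sort=rating&color=black
```

All three URLs can return essentially the same set of products, which is exactly the duplicate content pattern described above.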

If you find your site is generating these types of URLs, try the following:

1. Direct Google to avoid crawling these duplicate pages via robots.txt.

2. If you have a small website, you can manually remove the culprit URLs in Search Console.
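As a sketch of option 1, a robots.txt rule can block crawling of parameterized URLs. The parameter name below is illustrative; be careful that any pattern you use doesn't also block pages you want indexed:

```text
User-agent: *
# Block every URL that contains a query string...
Disallow: /*?

# ...or, more surgically, block only a specific filter/sort parameter:
Disallow: /*?sort=
Disallow: /*&sort=
```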

But what if your duplicate pages (filter pages) have already earned backlinks?

Consider adding a noindex, follow robots meta tag to the HTML head section of these pages.
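The tag itself is a single line placed inside the page's head element:

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```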

The "follow" directive tells Google to pass the earned link juice on to the other pages linked from this page, without actually indexing the duplicate page itself.

Let the Experts Help

Avoiding these five common technical mistakes can mean all the difference when it comes to your website rankings. If you are finding that determining or resolving these issues is too complex to handle by yourself, let the SEO experts at InterActive Circle help!

To receive a complimentary website audit or to learn more about how IAC can help you resolve your technical SEO issues, call us at (612)238-1466.