20 common Technical SEO mistakes and how to fix them


Over the course of my SEO (Search Engine Optimization) work, I have run countless technical audits on project websites.

Through them, the Best SEO Tool team has noticed a number of Technical SEO mistakes that SEOs run into again and again.

In your opinion:

  • What are the most common Technical SEO mistakes SEOs make?
  • Which of them has the biggest impact once fixed?

In today’s post, I’ll cover all of these technical SEO mistakes and show you how to identify and fix them. I hope it makes your self-taught web SEO journey more effective.

If you master the factors behind successful SEO, know how to recover your website after an algorithm update, and avoid the following 20 technical SEO errors, your website’s rankings will improve considerably.

Let’s get started.

1. Website Speed

According to Google, page speed is a ranking factor on search engines. It also directly affects how long users stay on your site.

Ideally, a page should load within 2-3 seconds. Users will not wait for your website to load when there are countless other websites on the Internet, and pages that load too slowly can hurt revenue too. Why?

Simply put, users will not stay on your site long enough to learn about your product/service, so the probability of them buying or signing up is low.

Common Technical SEO mistakes in website speed optimization

  • Images that have not been properly compressed or sized
  • Web code that is not written to standard
  • Too many plugins installed
  • Heavy JavaScript and CSS

How to find technical SEO errors that slow down your website?

  • Test your website with PageSpeed Insights, GTmetrix, or Pingdom

How to optimize website loading speed?

  • Hire a dedicated developer with experience in performance optimization
  • Make changes on a staging domain first so live site performance is not disrupted
  • If you run WordPress or another PHP CMS, make sure you have upgraded to PHP 7 or later. This has a huge impact on website speed.

2. Not optimized for mobile user experience (UX)

The truth of optimizing UX on mobile

With mobile-first indexing, the first thing Google’s algorithms review and evaluate is the mobile version of your website.

That doesn’t mean you can ignore or strip down the desktop experience. Google wants any changes you make to leave users on every device largely unaffected.

How to test mobile browser?

  • Use the Mobile-Friendly Test tool to see whether your website works well for mobile visitors
  • Check how Googlebot Smartphone crawls the website. Note that the smartphone crawler may not render every feature of the page.
  • Is your website responsive across different devices? If your site doesn’t work on mobile devices, fix it now.
  • Is any content unusable on mobile? Check that everything loads normally without errors, and make sure you fully test all of your mobile pages.

When starting a website audit, you should also check whether the website has been penalized by Google so you can act in time!

How to fix website errors on mobile phones?

  • Understand how mobile affects server-side page load.
  • Focus on building an impressive mobile page. Google prefers responsive websites, and responsive design is its recommended option for serving mobile pages.
  • If you currently run a standalone mobile subdomain (m.yourdomain.com), consider the potential impact of the extra crawling on your server.
  • Simply choosing a template update that matches the look and feel of your website is not enough for every plugin; consult your web developers to land on a template that works.
  • Set multiple breakpoints for mobile screens; a common baseline is a 320px width, matching the iPhone screen.
  • Test on iPhone as well as smartphones running Android.
  • If a piece of content needs fixing, such as Flash or other proprietary formats that don’t work in mobile browsers, consider switching to HTML5 so it displays properly on mobile. Google Web Designer lets you recreate a Flash file in HTML.

3. URL structure problem

As your site grows, it’s easy to lose track of the URL structure and hierarchy. Poor structure makes it difficult for both users and bots to navigate. This will negatively impact your rankings.

  • Web structure and hierarchical issues
  • Not using proper directory and subfolder structure
  • URLs that have special characters, capital letters, or are not useful for user search intent.
Technical SEO mistakes – URL structure problem

How to find URL structure errors?

  • 404 errors, 302 redirects, and XML sitemap problems are all signs that a website needs to be restructured.
  • Conduct a full website crawl (using SiteBulb, DeepCrawl or Screaming Frog) and manually review for quality issues
  • Check the Google Search Console report (Crawl > Crawl Errors)
  • Test users, ask people to find content on your site, or make a test purchase – Use a UX testing service to document their experience

How to fix URL structure error?

  • Website level planning – I encourage you to use a parent-child directory structure
  • Make sure all the content is placed in the right folder or subfolder
  • Make sure your URLs are easy to read and make sense
  • Remove or merge any content that ranks for the same keyword
  • Try to limit the number of subfolders to no more than three levels

It sounds a bit complicated, but I guarantee that an effective on-page SEO and URL optimization strategy for 2021 and the years ahead will fully repay the effort.


4. The site contains too much Thin Content

The truth is:

Google only wants to rank pages with in-depth content that provides information genuinely useful to users.

So don’t write content purely for SEO purposes; write the kind of content Google’s quality evaluations reward.

A page with too much poor quality content can negatively affect your SEO, for a number of reasons:

  • Content that does not meet user needs reduces conversion rates and customer reach.
  • Google’s algorithms heavily weigh a page’s content quality, reliability, and linkability.
  • Too much low-quality content reduces search engine crawl rates, indexing rates, and web traffic.
  • Many near-duplicate pages each targeting a single keyword dilute one another, instead of one detailed, informative article covering the whole topic.

How to find Thin Content error?

  • Scan the website for pages with a word count of less than 500 words.
  • Check Google Search Console to see if there are any manual notifications from Google.
  • Pages that fail to rank for their target keywords, or whose rankings are visibly declining
  • Check the bounce rate and user time on the website – the higher the bounce rate, the lower the quality of the content.

How to fix Thin Content error?

  • Combine multiple keywords on the same topic into one article instead of writing a separate article for each (the count is up to you, but I find about 5 or 6 keywords is enough)
  • Focus on the pages most likely to engage users. Consider adding video, audio, infographics, or images; if you don’t have these, source them on Upwork, Fiverr, or PPH.
  • Put people’s needs first: what do they want? Then create content that matches those needs.

5. Technical SEO mistake: an unoptimized Meta Description

Meta Description is one of the factors that determine the rate of users clicking on your article.

There are 2 cases:

  • If you don’t write a Meta Description, Google will automatically pull arbitrary content from the article to fill the gap.
  • If you write one that is too long, it will be truncated in the search results.

Of course, you will want to optimize the Meta Description as much as possible.

  • A Meta Description must summarize the main content of the article. (This part is a bit tricky, isn’t it? Writing long is easy; writing short is not.)
  • Keep it to about 120 characters so it displays well on both desktop and mobile. Quicker to write, too, right?
  • A Meta Description does not need to be stuffed with the keywords you are optimizing for.

How to recognize Meta Description errors?

  • Use Screaming Frog to check Meta Description lengths and to find articles missing one across the entire website.
  • Checking whether a Meta Description is stuffed with keywords is just as simple.

Since you already have the list of each article’s Meta Description plus its target keywords, you can spot the problems at a glance.

How to fix Meta Description errors quickly?

  • Write a full Meta Description before publishing the article.
  • Add Meta Description for the entire missing article.

Note: keep each article’s Meta Description under about 120 characters.
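If you’d rather script this check than eyeball a Screaming Frog export, a minimal sketch using only Python’s standard library might look like this (the 120-character limit mirrors the note above; the class and function names are my own, not a standard tool):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of every <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.descriptions.append(a.get("content", ""))

def audit_meta_description(html, max_len=120):
    """Return a list of issues for one page's HTML (empty list = OK)."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    issues = []
    if not parser.descriptions:
        issues.append("missing meta description")
    for d in parser.descriptions:
        if len(d) > max_len:
            issues.append(f"meta description too long ({len(d)} chars)")
    return issues
```

Run it over each page’s HTML (fetched however you crawl) and collect the non-empty results into your fix list.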

6. Technical SEO mistake: an unoptimized H1 / Title

It can be said that H1 / Title is the most important content, attracting users to visit your website. The title shows up right on the Google search rankings and the H1 is right in the “prime” position – the first line of the article.

Are you sure you understand the importance of H1/Title already?

Some common Technical SEO mistakes

  • H1/Title too long (H1 > 70 characters, Title > 65 characters), or missing the main keyword and LSI keywords.
  • H1 and Title are the same
  • Missing H1 or H1 not placed at the beginning of the article.
  • Titles of some blog posts on the website are duplicated.

How to detect H1/Title errors?

In fact, you can use the multi-purpose Screaming Frog tool to check for most on-page technical SEO errors, including unoptimized H1s and Titles.

Or use the “allintitle: title name” structure to check if your existing titles are the same as the titles of other articles on the web.

Ex: Type allintitle: 20 common Technical SEO mistakes and how to fix them to check if there is a duplicate Title or not.

How to fix H1/Title error?

The fix depends on which mistake you are making when optimizing your H1 and Title, and it is actually very simple!

  • The Screaming Frog report tells you which articles are missing an H1, or where the H1 and Title are identical, so you can adjust them.
  • Insert the main keyword and LSI keywords into the H1 and Title yourself.
  • Mind the character limits for the H1 and the Title.
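The Screaming Frog report covers this, but a self-made check is also possible. Here is a minimal sketch with Python’s standard library (the character limits follow the numbers above; the names are my own assumptions):

```python
from html.parser import HTMLParser

class HeadingParser(HTMLParser):
    """Extracts the <title> text and all <h1> texts from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1s = []
        self._current = None  # tag whose text is being captured

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag
            if tag == "h1":
                self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1s[-1] += data

def audit_headings(html, title_max=65, h1_max=70):
    """Flag the common H1/Title mistakes listed above."""
    p = HeadingParser()
    p.feed(html)
    issues = []
    if len(p.h1s) == 0:
        issues.append("missing H1")
    if len(p.h1s) > 1:
        issues.append("multiple H1 tags")
    if p.title and len(p.title) > title_max:
        issues.append("title too long")
    if p.h1s and len(p.h1s[0]) > h1_max:
        issues.append("H1 too long")
    if p.title and p.h1s and p.title.strip() == p.h1s[0].strip():
        issues.append("H1 identical to title")
    return issues
```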

7. Too many pieces of irrelevant content

In addition to adjusting the thin pages, you also have to make sure the content in them is related to each other. Irrelevant pages are not only unhelpful to users, but they also detract from other areas of your website that are performing at their best.

Who doesn’t want their website to serve Google only its best content, building credibility, authority, and SEO strength?

A few common cases

  • Create pages with low engagement
  • Let search engines crawl through non-SEO pages.

How to find these technical SEO errors?

  • Revisit your content strategies. Focus on creating more quality pages instead of trying to create lots of pages.
  • Check the statistics gathered from Google and see which pages are being indexed and crawled

How to fix irrelevant content errors?

  • Don’t obsess over output targets when planning content. Instead of publishing 6 posts because the plan says so, focus on adding value to each one.
  • For pages you don’t want Google to rank, block them in robots.txt (or better, noindex them; a robots.txt block only stops crawling, not indexing). This way, Google sees only the best side of your website.

8. Not taking advantage of Internal Links to build a link network

Internal links distribute link equity across a website. Sites with thin or irrelevant content usually have fewer cross-links than sites with lots of quality content.

Cross-linking articles and posts helps Google understand your site better. In technical SEO terms, internal links help you build a hierarchical site structure while improving keyword rankings: one keyword that rises can pull others up with it.

Group keywords by topic

How to find these technical SEO errors?

  • For the pages you want to rank, check which internal pages link to them. You can use Google Analytics to review a page’s internal links.
  • Use Screaming Frog to crawl inlinks.
  • You will spot problems yourself once you actively link between pages on your website.
  • Do you add nofollow to internal links through a plugin that applies to all links? Inspect the link markup in the browser’s source view.
  • A telltale symptom: the same small set of anchor texts and links reused across the site.

How to fix these technical SEO errors?

  • For pages you want to push to the top, expand their content using material already available on other pages of the website, then insert internal links into the article.
  • Use Screaming Frog crawl data to find more internal link building opportunities.
  • Don’t over-stuff links or linking keywords. Keep it natural and orderly.
  • Check the nofollow rule in whatever plugin you use to manage links.

9. Insufficient use of structured data

Using structured data (Schema Markup) will give your website great advantages such as:

  • An enhanced appearance in search results: Schema markup helps your listing display more complete, richer information.
  • A solid Entity network that strengthens links and increases your site’s relevance for certain searches.

Therefore, if you do not use enough structured data, your website loses some important advantages and your Entity network may become loose.

How to fix:

  • First, check whether your website already has Schema integrated: go to the structured data testing tool, enter your URL, and run the test. Google returns a table of the structured data found at that URL, from which you can judge whether you are using structured data, and whether you are using enough of it.
  • If your website has not been set up with structured data, go ahead and add Schema. I have a very detailed post on setting up Schema; check it out to understand the implementation! I will leave the link right below.

10. User goes to wrong language page

In 2011, Google introduced the hreflang tag to brands engaged in global SEO to help them improve their user experience.

Accordingly, hreflang annotations tie together the alternate language and regional versions of a page so that local search engines can serve the right one.

The tag is supposed to signal to Google the correct web page to offer to users based on search language or location.

The code of the Hreflang tag displays as follows:

<link rel="alternate" href="http://example.com" hreflang="en-us" />

However, getting readers to the article in their native language is not easy; it requires precise and extremely detailed markup, which is by no means simple.

As a result, broken language links occur quite often.


There are several ways to solve problems with Hreflang tags:

  • Make sure the tag’s code is generated correctly. You can use Aleyda Solis’ hreflang Tags Generator Tool for this check.
  • When updating a page or creating a page redirect, update the code on all pages that reference or link to that page.
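For reference, a full set of annotations might look like this (the URLs are placeholders). Note that every page listed should carry the same set of tags, including a reference to itself:

```html
<!-- In the <head> of the English (US) version -->
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

The x-default entry tells Google which version to serve users who match none of the listed languages.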

Above are the most common basic Onpage SEO technical errors. However, depending on the specific field, the risk of making technical SEO Onpage errors will also be different. Especially for medium and large e-commerce websites.

11. Failure to strictly manage 404 errors

Among the technical SEO errors, this is the error that e-commerce sites often encounter.

Specifically, when a product is discontinued or expires, its page is easily forgotten and ends up returning a 404 error.


Although 404 errors can limit crawling, don’t worry too much; on their own they will not hurt your SEO.

However, 404 pages are really problematic when they:

  • Receive massive traffic internally or from organic search.
  • Have external links pointing to them.
  • Have internal links pointing to them.
  • Make up a large share of pages on a larger website.
  • Are shared on social networks or on other websites.

The way to fix the above problems is…

Set up a 301 redirect from the deleted page to another relevant page on your website.

This preserves some of the page’s SEO equity and ensures seamless navigation for users.

How to find 404 pages?

  • Crawl the entire website (via SiteBulb, DeepCrawl, or Screaming Frog) to find 404 pages.
  • Check the Google Search Console report (formerly Google Webmaster Tools)

How to fix 404 errors?

  • Analyze the list of 404 errors on your website.
  • Cross-check the URLs in Google Analytics to see which of these pages still get traffic.
  • Cross-check the URLs in Google Search Console to see which pages have links from external sites.
  • For high-value pages, identify the existing page on your site most relevant to the deleted one.
  • Set up a server-side 301 redirect from the 404 page to that page. If you keep a 4XX page, make sure it actually works so it doesn’t hurt the user experience.
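As a sketch, the server-side 301 for a deleted page can be a one-liner in Apache’s .htaccess (the paths here are hypothetical examples, not rules for your site):

```apache
# Send a deleted product page to its closest surviving
# category page with a permanent (301) redirect
Redirect 301 /products/old-widget /products/widgets/
```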

12. Problems when moving website (Redirect)

When creating a new website, a new page or new design changes, technical problems need to be resolved quickly.

Some common Technical SEO mistakes are often seen

  • Using 302 (temporary) redirects instead of 301 (permanent) redirects. Google has recently suggested that 302 redirects can pass SEO equity, but based on the internal data I’ve gathered, I still think 301 redirects are the safer choice.

Many of you use a 302 to redirect HTTP to HTTPS; please check and fix this, otherwise your whole site’s SEO can suffer!

  • Incorrect HTTPS settings on the site. In particular, not redirecting a site’s HTTP to HTTPS can create duplicate-page problems.
  • Not 301-redirecting the old site to the new one. This problem usually occurs when a plugin is used for 301 redirects; set up 301 redirects at the server level instead, for example via cPanel or .htaccess.
  • Leaving staging-domain tags on the live site (canonical tags, NOINDEX tags, and other tags meant to keep staging pages out of the index).
  • Indexed staging domains: the reverse scenario, which occurs when you tag staging domains (or subdomains) incorrectly while intending to noindex them from the SERPs.
  • Creating “redirect chains” while cleaning up old sites; in other words, adding a new set of redirects without first identifying pages that were already redirected.
  • Not settling on www or non-www in the .htaccess file, so Google may index two or more versions of your website, and duplicate pages are likely to be indexed too.

How to recognize these technical SEO errors?

  • Conduct a full crawl of the website (using SiteBulb, DeepCrawl or Screaming Frog) to get the required data.

How to fix errors when moving website

Pay close attention and check:

  • Three times, to make sure your 301 redirects are set up properly.
  • Whether your 301 and 302 redirects point to the correct pages.
  • The canonical tags, to make sure they are placed correctly.
  • If you must choose between canonicalizing a page and 301-redirecting it, a 301 redirect is usually safer and more effective.
  • Your code, to make sure all NOINDEX tags have been removed. Don’t overlook plugin settings, as developers may have coded NOINDEX into the header.
  • The robots.txt file, and update it.
  • The .htaccess file, and update it.
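As an illustration of the .htaccess checks above, a common Apache pattern forces both HTTPS and the www host in a single 301 hop (example.com is a placeholder; this is one common pattern, not the only correct one, so test before deploying):

```apache
RewriteEngine On
# If the request is plain HTTP, or the host lacks the www prefix,
# issue one permanent redirect to the canonical HTTPS www URL
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]
```

Doing both conditions in one rule avoids creating a chain (HTTP → HTTPS → www).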

13. XML Sitemap Error

The XML Sitemap lists the URLs on your website that you want search engines to crawl and index. For each page you can add information such as:

  • When it was last updated
  • How often it changes
  • Its importance relative to other URLs on the website

While Google admits to ignoring much of this information, proper sitemap optimization still matters, especially on large sites with complex structures.

Sitemaps are especially beneficial on sites where:

  • Some areas of the website are not reachable through the browsable interface
  • Webmasters use rich content that search engines can’t process, such as Ajax, Silverlight, or Flash
  • The site is so large that web crawlers miss some recently updated content
  • The site has a large number of isolated pages, or pages that do not link well to each other
  • Crawl budget is being wasted on unimportant pages (if so, use noindex to keep crawlers off them immediately)

How to find these errors?

  • Submit your Sitemap in Google Search Console.
  • If you do SEO on Bing, remember to submit the Sitemap through Bing Webmaster Tools as well!
  • Check for sitemap errors via: Crawl -> Sitemaps -> Sitemap Error
  • Check your log files to see when your Sitemap was last accessed
How to fix those errors?

  • Make sure your XML Sitemap is connected to Google Search Console.
  • Run a server log analysis to see how often Google crawls your sitemap. There is much more you can do with server log files, which I will cover later.
  • Google will show you the issues, with examples, so you can correct them.
  • If you use a plugin to generate the Sitemap, make sure it is up to date and the file it generates works correctly.
  • If you’d rather not check server logs in Excel, use a log analysis tool such as Logz.io, Graylog, SEOlyzer, or Loggly to see how your XML Sitemaps are used.
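If no plugin fits, generating a minimal sitemap yourself is straightforward. A sketch using Python’s standard library (the function name and the (loc, lastmod) input shape are my own choices):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

Write the returned string to sitemap.xml at your site root, then submit it in Google Search Console.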
14. Errors in the robots.txt file
The robots.txt file controls search engines’ access to your website.

Many people believe that the robots.txt file is the cause of preventing the indexing process of the website.

However, most robots.txt problems arise from not updating the file when moving a website, or from entering the wrong syntax.

How do I know the robots.txt file has an error?

  • Check your website stats
  • Check the Google Search Console report (Crawl > robots.txt Tester)
How to fix errors in the robots.txt file?

  • Check the Google Search Console report; it will help validate your file
  • Make sure the pages/folders you do NOT want crawled are listed in your robots.txt
  • Make sure you don’t block anything important (JS, CSS, your 404 page, etc.)
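A simple robots.txt following those rules might look like this (the paths are placeholders; adapt them to your own site structure):

```
# Block crawl-wasting sections, never CSS or JS
User-agent: *
Disallow: /cart/
Disallow: /search/
Allow: /wp-content/uploads/

Sitemap: https://example.com/sitemap.xml
```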
15. Abuse of Canonical tags
The Canonical tag is a piece of HTML that helps search engines handle duplicate content. If two pages are the same, you can use this tag to tell search engines which page you want shown in search results.

If your website runs on a CMS like WordPress or Shopify, you can simply use a plugin (preferably Yoast) to install the Canonical tag.

I have found that some websites misuse the Canonical tag in ways like:

  • Canonical tags pointing to unrelated pages
  • Canonical tags pointing to 404 pages
  • Canonical tags chained together
  • E-commerce “faceted navigation” creating page variants without canonicals
  • The CMS creating two versions of the same page

It is genuinely important to tell search engines which pages on your site are duplicates, because this significantly affects indexing as well as ranking.

How to identify Canonical tag errors?

  • Crawl the entire website with DeepCrawl
  • Compare the “Canonical link elements” against the original URLs to see which pages are using Canonical tags
How to fix Canonical tag errors?

  • Recheck the pages to determine whether any Canonical tags point to the wrong page
  • Also review all your content to find pages with similar content, and any further pages that need a Canonical tag
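For reference, the tag itself goes in the <head> of the duplicate page and points at the preferred version (URLs here are placeholders):

```html
<!-- On a filtered product listing, point to the preferred version -->
<link rel="canonical" href="https://example.com/products/widgets/" />
```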
16. Misuse of robots tags

Like the robots.txt file, robots tags can also be used in a page’s header code. This opens up more potential problems, especially when robots directives exist both at the file level and on individual pages. In some cases, I’ve seen multiple robots tags on the same page.

This leaves Google with conflicting signals and will likely keep otherwise well-optimized pages from having a chance to rank.
How to recognize robots tag problems?

  • Check the source code in the browser to see whether any robots tag has been added more than once.
  • Check the syntax, and make sure you are not confusing the nofollow link attribute with the nofollow robots tag.
How to fix robots tag errors?

  • Use Yoast SEO to manage your robots tag settings effectively.
  • You can also use a plugin to control robots behavior.
  • Make sure you edit the templates where the robots tag is placed, via: Appearance > Themes > Edit > header.php
  • Keep your crawl directives centralized in the robots.txt file so you don’t have to hunt from file to file.
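For reference, a page-level robots directive looks like this; a page should carry at most one such tag:

```html
<!-- Keep this page out of the index but let the crawler follow its links -->
<meta name="robots" content="noindex, follow" />
```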

17. Not managing Crawl Budget well
Google cannot crawl all the content on the internet at once, so Googlebot allocates crawl capacity to websites based on certain factors.

Sites with higher authority get a larger Crawl Budget: more of their content is crawled and indexed than on lower-authority sites (those with fewer pages and fewer visits).
How to recognize these errors?

  • Review crawl stats in Google Search Console: go to Search Console > enter your domain name > Crawl > Crawl Stats.
  • Use server logs to find out where on your site Googlebot spends the most time. This tells you whether Googlebot is focusing on the right pages.
How to fix these errors?

  • Minimize the number of errors on your website.
  • Noindex pages that you don’t want Google to see.
  • Reduce redirect chains by finding all links that point to redirected pages and updating them to the final landing page.
  • Fix the other issues discussed above to increase your crawl budget, or to focus it on the right content over the long term.
  • For e-commerce websites in particular, if a parameter does not change the actual content on the page, you should not block those multi-dimensional navigation parameter tags.
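To make the redirect-chain point concrete, here is a small sketch (my own illustration, not a standard tool) that collapses a chain map so every old URL points straight at its final destination:

```python
def flatten_redirects(redirects):
    """Collapse redirect chains: point every source straight at its
    final destination. `redirects` maps old URL -> next URL."""
    flattened = {}
    for src in redirects:
        seen = {src}          # guard against redirect loops
        dest = redirects[src]
        while dest in redirects and dest not in seen:
            seen.add(dest)
            dest = redirects[dest]
        flattened[src] = dest
    return flattened
```

Feed it the redirect map exported from your crawler, then replace each chained rule with the flattened one-hop version.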
18. Not optimized for HTTPS
In October 2017, Google issued a “not secure” warning in Chrome every time a user visited HTTP websites.

Instead, Google encourages users to switch to HTTPS websites to ensure more effective security.

To check if your website is HTTPS, enter your domain name in Google Chrome. If you see a “secure” message, then your site is indeed HTTPS.

Conversely, if your site is not secure, when you enter your domain name into Google Chrome, it will show a gray background – or even worse, a red background with a “not secure” warning. This can cause users to immediately leave your site.

Some ways to solve this problem:

  • To convert your site to HTTPS, you need an SSL certificate from a Certificate Authority.
  • After you purchase and install the certificate, your website will be served securely.
19. Multiple URL versions for homepage
There are cases where multiple URLs all return the same website. For example, “yourwebsite.com” and “www.yourwebsite.com” both lead to the same site.

This may seem convenient, but it actually reduces your site’s visibility in search.

Some ways to solve this problem:

  • First, check whether the different URL versions (HTTPS, HTTP, “www.yourwebsite.com/home.html”, and so on) successfully redirect to a single canonical URL. Another way is to search “site:yoursitename.com” to see which pages are indexed and whether they come from multiple URL versions.
  • If you discover multiple URL versions of your homepage, set up 301 redirects, or ask your web developer to set them up for you. You should also set your preferred canonical domain in Google Search Console.
20. Missing alt tags for images
Alt tags make it easier for search engines to index your pages by telling bots what each image in an article shows. This increases the page’s SEO value and improves the user experience.

Therefore, missing images or missing alt tags are unfortunate shortcomings that should be corrected immediately.

How to fix alt tag issues for images:

Run regular SEO audits to monitor your image content and to manage and update image alt tags across your website.
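A minimal audit of this kind is easy to script. Here is a sketch using Python’s standard library (the names are my own; run it over each page’s HTML from your crawl):

```python
from html.parser import HTMLParser

class ImgAltParser(HTMLParser):
    """Collects the src of every <img> whose alt is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing_alt.append(a.get("src", "(no src)"))

def find_images_missing_alt(html):
    p = ImgAltParser()
    p.feed(html)
    return p.missing_alt
```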

Through research and discussion, I’ve realized that many people, including longtime SEOs, still make these basic Technical SEO mistakes when implementing SEO. Mistakes like these can leave your website stuck forever on pages 2, 3, and beyond, bringing no conversion value to the business. Keep this checklist handy; “prevention is better than cure”, right?

Epilogue
In this article, I have introduced 20 common Technical SEO mistakes that every SEOer runs into during search engine optimization, along with how to detect each one and fix it effectively.

Hopefully, after reading the article, you will be able to recognize these technical errors, avoid unnecessary risks to your website, and, above all, apply timely and correct fixes to keep your website running and growing.

Good luck!