Diagnose technical SEO problems and fix them to improve your website's performance
Doing SEO does not just mean creating optimized content. That is just one (important) part of getting a page to rank in position zero of Google's search results. If you feel your efforts are not paying off even after producing plenty of content, it is time to fix the technical SEO issues that may be holding your site back.
Analyzing websites and performing SEO audits is one of the tasks I carry out most often. I have seen many websites with technical SEO problems reach Google's first page once those issues were resolved. In this article, I present the most common technical SEO problems. Solving them will make your website perform better and improve its organic traffic.
Find and fix these technical SEO problems
Here are the common technical problems that we often overlook. These issues can damage your website, so let's identify them and fix them as soon as possible.
Slow page loading speed
A website that takes more than two seconds to load is generating a bad experience. Google sends fewer visitors to slow pages, which in turn reduces the chance of new pages being indexed.
You would be working without achieving your objective, because a website that does not appear in Google does not generate traffic, and without traffic your mission fails. Many factors affect how fast a website loads; for example, if a site contains many large images, they may be the cause of the slow loading.
How to solve the problem?
PageSpeed Insights is a very useful Google tool that helps determine which aspects are slowing a site down. Enter your page's URL, let the tool analyze it, and it will offer a series of aspects to correct:
- Reduce the number of internal redirects
- Enable file compression so that the browser can handle them better and display them faster in response to the user’s request
- Reduce server response time
These plugins can help you to improve page speed:
- Autoptimize: A complete plugin that minifies, compresses, and combines JS, CSS, and HTML, and can move scripts to the end of the HTML so the visible content loads first.
- W3 Total Cache: One of the best-known plugins because, beyond minification, it also manages your cache. The problem is that its interface is not made for beginners, so the assistance of a specialist may be necessary.
Minify the CSS and JS codes to improve page speed:
- Refresh-sf.com: It allows you to minify JS, HTML and CSS at the same time.
After compressing the codes, check your website’s loading speed.
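Before reaching for the tools above, a rough first measurement of load time can be scripted. A minimal sketch using only Python's standard library; the URL in the usage comment is a placeholder, and this measures only the raw HTML download, not full page rendering:

```python
import time
import urllib.request

SLOW_THRESHOLD = 2.0  # seconds, the guideline discussed above


def measure_load_time(url: str) -> float:
    """Return the seconds taken to download the raw HTML of `url`."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start


def is_slow(seconds: float, threshold: float = SLOW_THRESHOLD) -> bool:
    """True when the measured time exceeds the two-second guideline."""
    return seconds > threshold


# Usage (requires network access):
#   elapsed = measure_load_time("https://example.com/")
#   print(f"Loaded in {elapsed:.2f}s - slow: {is_slow(elapsed)}")
```

Treat this only as a smoke test; PageSpeed Insights measures real rendering and gives far more actionable detail.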
Core Web Vitals
Core Web Vitals are metrics that measure user experience on a site and influence organic rankings. Google evaluates three factors on every website: LCP, FID, and CLS. A bad score indicates that the site offers a poor user experience.
Largest Contentful Paint (LCP) measures how long the largest content element takes to render. Up to 2.5 seconds is considered good; in the 2.5-4 second range Google recommends making improvements, and anything above 4 seconds must be addressed urgently.
First Input Delay (FID) measures how quickly a website responds to the user's first interaction. Up to 0.1 seconds is good; in the 0.1-0.3 second range Google will suggest improvements, and anything above 0.3 seconds is considered really negative and can penalize the page's ranking.
Cumulative Layout Shift (CLS) measures the visual stability of a site. This is probably the most abstract of the three metrics, so here is a short explanation: Google considers a CLS score below 0.1 correct, while anything above 0.25 indicates a page that needs a drastic improvement in its design.
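The three thresholds above can be collected into one small helper, which is handy when checking many pages at once. A sketch using the bands described above (LCP and FID in seconds, CLS unitless):

```python
# Classify Core Web Vitals values against the thresholds described above.


def rate_metric(value: float, good: float, poor: float) -> str:
    """Map a measurement onto Google's three-band rating."""
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"


def rate_core_web_vitals(lcp_s: float, fid_s: float, cls: float) -> dict:
    """Rate the three Core Web Vitals for one page."""
    return {
        "LCP": rate_metric(lcp_s, good=2.5, poor=4.0),
        "FID": rate_metric(fid_s, good=0.1, poor=0.3),
        "CLS": rate_metric(cls, good=0.1, poor=0.25),
    }


print(rate_core_web_vitals(lcp_s=3.1, fid_s=0.05, cls=0.31))
# → {'LCP': 'needs improvement', 'FID': 'good', 'CLS': 'poor'}
```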
How to check and fix it?
Google presents page details in Search Console so you can understand the Core Web Vitals results. Use PageSpeed Insights to detect each page's problems and fix them. Chrome DevTools also lets you see the metrics in real time, and if you want an automated Core Web Vitals audit, run Lighthouse against your website.
By improving your website's loading speed and solving the detected issues, you can easily fix the Core Web Vitals. Focus on your media files and lazy loading; plugins can optimize both easily.
Check the redesign of the website
In many of the cases I analyze, I find that a website has suffered a drop in organic traffic after a redesign. Sometimes it drops by around 80%, other times by 10% or 20%. A new website design is a double-edged sword at the SEO level: if not done right, it can cause a traffic "crash" that takes months to recover from. It is often forgotten to create the corresponding 301 redirects from the old URLs to the new ones, so the link juice is lost. In these cases, the website's rankings fall and, with them, a large number of visits. Organic traffic can also drop when a lot of text is removed from the main pages, topic categories, or product pages because the new design is more minimalist.
How to find it?
It can be easily checked through Google Analytics. If the date of the redesign coincides with the drop in organic traffic, you will know you have suffered from this SEO problem.
How to fix it?
In many cases I have analyzed, the old URLs have been deleted, so 301 redirects cannot be created from within the site. In other cases, where a CMS such as WordPress has been used, the old URLs are still active but no longer visible. By implementing a simple redirect plugin, you can recover the lost organic traffic within a few weeks.
Use Redirection plugin:
With this plugin, you can create 301 redirects very easily.
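If you prefer to manage redirects outside a plugin, the old-to-new URL mapping can be turned into server rules instead. A sketch that emits Apache-style `Redirect 301` lines; the paths are invented for illustration, and nginx or other servers use a different syntax:

```python
# Turn a mapping of old URLs to new ones into Apache mod_alias rules, so no
# old URL is left answering a 404 after a redesign.


def redirect_rules(mapping: dict) -> str:
    """Emit one `Redirect 301 old new` line per entry, sorted for stability."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )


moves = {
    "/old-services": "/services",
    "/blog/old-post": "/blog/new-post",
}
print(redirect_rules(moves))
```

The generated lines can be pasted into the site's .htaccess; test each old URL afterwards to confirm it answers with a 301.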
If the drop in traffic is because text has been removed from the main pages or categories, leaving the text-to-HTML ratio very unbalanced, we must study how to put relevant text back on them.
Let’s learn a little more about this:
- Text ratio: the percentage of plain text in a specific URL.
- HTML ratio: the percentage of CSS code, images, and tags.
What is the ideal percentage between text and code on a web page?
It is recommended that this ratio be around 30/70. That is, 30% of the page's content should be plain text, with the rest made up of add-ons such as images, multimedia, scripts, and CSS.
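The text ratio described above can be approximated with a short script. A sketch using only Python's standard library; it counts visible text against the raw HTML size and ignores `<script>` and `<style>` contents:

```python
from html.parser import HTMLParser


class _TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)


def text_ratio(html: str) -> float:
    """Percentage of the raw HTML document that is visible plain text."""
    if not html:
        return 0.0
    parser = _TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return 100.0 * len(text) / len(html)


sample = "<html><body><p>Hello world</p><script>var x=1;</script></body></html>"
print(f"{text_ratio(sample):.1f}% plain text")
```

A real page would be fetched first and fed to `text_ratio`; compare the result against the 30% guideline above.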
Can Google penalize a low text-to-HTML ratio?
It can, in the form of lost organic positions. I consider this an important part of any SEO audit. Remember that it is easier for Google to crawl text than images. Even with voice search, Google relies on matching the keywords the user has spoken.
Mobile-friendliness problems
Studies show that almost 24% of existing web pages are not compatible with mobile devices. In 2018, Google officially declared that it would rank each web page by its mobile version first and its desktop version second: mobile first. Do not confuse "responsive web design" with "mobile web design", as there are differences.
Here are some tips on mobile web design:
- Don't place links too close together.
- If the text is too small, there is a 95% chance of incompatibility with mobile devices.
- If the content text does not fit the window, there is a 55% chance of not passing the test.
- It is a bad idea to block CSS and JS files in robots.txt.
- Build lighter, faster pages.
How to check mobile-friendliness?
To check whether a website is compatible with mobile devices, use Google's Mobile-Friendly Test.
How to fix it?
The tools give you information about which resources you should optimize; aim for the best possible score in them. If you don't know how, contact a web developer. If your site runs an outdated PHP version, updating to PHP 7 or later will also improve the website's loading speed.
Missing SSL certificate (HTTPS)
The SSL certificate has little direct influence on SEO, perhaps 0.5% of the total, which is very little. HTTPS is an Internet communication protocol that offers safer browsing, improving the integrity and confidentiality of user data. Google managed to switch many websites around the world to this protocol rapidly. Suppose a user enters a website from Google or any other source and finds an "insecure" warning: they may become suspicious and go back where they came from. A high bounce rate means a bad user experience.
How to get it?
The only formal requirement for installing an SSL certificate on your company's website is that the server hosting the site allows SSL certificates to be installed.
An SSL certificate alone will not earn you better Google positions, but it does generate trust among your visitors.
Broken links
A broken link is a link that takes you to a page that no longer works, what we know as a 404 error. Common causes include:
- The page is restricted to a certain audience, such as a country.
- The entire website has been removed or moved to another domain without redirects.
- The structure of the permanent links was changed (more common for internal links).
- Linked content (PDF, Word, videos, etc.) has been deleted or is no longer available (for example, on Dropbox).
How to check it?
Checking every link on your website by hand would be a tedious and long task: every link in every article and page, and even inbound links.
Search Console will not give you a list of broken links as such, but it will list the 404 errors on your website. Go to the "Coverage" section and analyze all the content and errors. Look for the 404 errors and you will have all the broken links on your website.
An online broken-link checker is much more comfortable and provides more information. You just enter the URL and click the "Find broken links" button. It reports:
- The distinct broken links
- A list of all pages that link to each broken link, with a count
Then all that remains is to give your approval and wait.
Screaming Frog SEO Spider Tool
If you have read or taken courses on niche SEO, you have probably heard of this software for keyword analysis. Well, it also works for checking broken links.
Once it is installed, analyze your web page: type your URL in the main bar and click "Start". Wait for the analysis to finish, then find and click the "Response Codes" tab in the top bar. In the drop-down, click "Client Error (4xx)". Review the broken links it reports, because it scans your ENTIRE site: images, links, resources, 4xx errors, redirects, etc.
How to fix it?
After detecting the broken links:
- Make a list of them.
- Enter your website and remove each link at its detected source.
- Replace it with a different source that covers the same topic you wish to refer to.
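For a known list of URLs, the detection step can also be scripted. A sketch with Python's standard library; the `status_fn` parameter lets you plug in a fake status source for offline testing, and the demo URLs are placeholders:

```python
import urllib.error
import urllib.request


def link_status(url: str) -> int:
    """Return the HTTP status code for `url` (200, 404, ...)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # 4xx/5xx answers arrive as HTTPError


def broken_links(urls, status_fn=link_status):
    """Return the subset of `urls` whose status is a client error (4xx)."""
    return [u for u in urls if 400 <= status_fn(u) < 500]


# Usage (requires network):
#   print(broken_links(["https://example.com/", "https://example.com/missing"]))
```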
Poor content structure
Creating content to drive traffic sometimes comes with an added problem: poor content structuring causes ranking problems. A bad web structure or categorization affects both small and large websites, something I have verified in many of the technical SEO audits I have carried out. Time must be spent creating content, but also organizing it. Consider this an invisible secret weapon: as a website grows, well-categorized content becomes more and more important.
For instance, if new content is created weekly on the blog, try to avoid cannibalization problems between articles. It also often happens that blog articles take away the rankings of product pages in online stores, which is a problem for e-commerce sales. You have to be careful with keyword choices when a blog and an online store live on the same site. Keyword cannibalization occurs when two or more URLs of a website unintentionally rank for the same keyword.
One small detail: it is not always bad to have two URLs ranked on page 1 for the same keyword. Keep in mind that Google shows pages in the SERPs according to the user's intent, so two pages of the same website may legitimately be displayed for one search term.
How to detect it?
To detect keyword cannibalization, we can use a search command as simple as this: search your domain together with your keyword in quotes:

site:domain.com "keyword"

For example, for the keyword "generate organic traffic":

site:abc.com "generate organic traffic"
If you see more than one URL of yours in the SERPs, it means that you will have to review those pages because one may detract from the other.
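If you export ranking data as (keyword, URL) pairs from a rank tracker or Search Console, the same check can be automated. A sketch with illustrative data:

```python
from collections import defaultdict


def cannibalized(rows):
    """Return {keyword: [urls]} for keywords with more than one ranking URL."""
    by_keyword = defaultdict(set)
    for keyword, url in rows:
        by_keyword[keyword].add(url)
    return {kw: sorted(urls) for kw, urls in by_keyword.items() if len(urls) > 1}


rows = [
    ("generate organic traffic", "https://abc.com/blog/organic-traffic"),
    ("generate organic traffic", "https://abc.com/services/seo"),
    ("technical seo", "https://abc.com/blog/technical-seo"),
]
print(cannibalized(rows))
```

Remember the caveat above: two ranking URLs are only a problem when they compete for the same intent, so treat the output as a review list, not an error list.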
How to fix it?
Google uses entities to find the context of a search term, and the more context you give it, the better. Entities are why it is difficult to rank for a keyword with a single piece of content: they send Google signals about the quality and quantity of your content. For this reason, content must be classified. When working on the web structure, think about how a library is organized: group related pieces under clear categories so each page covers one intent.
Crawling and indexing problems
This point is crucial within a technical SEO audit, whether the website is small or large. Wasting the time of Google's bots is not a good idea: crawling, and the indexing and organic positioning that follow it, largely depend on Google's ability to crawl the page. In many of the audits I do, this is one of the main problems I observe, and it brings many organic positioning problems with it. Making the site easier for bots to crawl and focusing them on its essential pages is one of the SEO skills I like the most. I can assure you that some websites have changed overnight by quickly correcting only this dramatic point.
How to detect indexing problems?
The best place to detect crawl problems is within the Search console.
Inside the section: Crawl > Sitemaps
Here, check whether there is a large difference between the number of URLs submitted to Google and those actually indexed. Second, look at Google Index > Index Status: a significant drop in the total number of indexed pages would indicate a problem to solve.
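To get the "URLs submitted" side of that comparison, you can count the entries in your sitemap. A sketch that parses sitemap XML with Python's standard library; in practice you would download sitemap.xml first, and the sample below is illustrative:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, needed for namespaced findall() queries.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def sitemap_urls(xml_text: str):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]


sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
print(len(sitemap_urls(sample)))
# → 2
```

Compare that count with the indexed figure shown in Search Console; a large gap points to a crawl or indexing problem.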
How to fix it?
Google has stated that CSS and JS files should not be blocked in robots.txt. If you use a CMS or a custom website and find that unimportant pages are being indexed, place a "noindex" tag on them. As a website grows, this point becomes more crucial for the site to keep ranking correctly.
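Whether a robots.txt blocks CSS or JS for Googlebot can be checked programmatically. A sketch using Python's standard-library robots.txt parser; the rules and asset URLs below are an intentionally bad illustrative example:

```python
from urllib.robotparser import RobotFileParser


def blocked_assets(robots_txt: str, asset_urls, agent="Googlebot"):
    """Return the asset URLs that `agent` is not allowed to fetch."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in asset_urls if not rp.can_fetch(agent, u)]


# A bad robots.txt: it hides the JS directory from all crawlers.
robots = """User-agent: *
Disallow: /assets/js/
"""
assets = [
    "https://example.com/assets/js/app.js",
    "https://example.com/assets/css/site.css",
]
print(blocked_assets(robots, assets))
# → ['https://example.com/assets/js/app.js']
```

Any URL this reports for your CSS or JS directories means Google cannot render the page as users see it; remove the offending Disallow rule.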
Thin content issues
Thin content is another of the most common problems, especially in online stores, although it also appears on blogs. Thin content takes potential away from the rest of the domain: pages with poor content tend to drag down the pages that rank well. It is a technical problem that must be addressed and solved.
How to find it?
If users enter a page and leave very quickly, it is a bad sign: the content on that page may be incomplete or missing, or it may not give the relevant information, so users go to another website to find it. You can detect this very easily through Google Analytics, following these steps:
- Go to the behavior section
- Site content
- Landing pages
- Select the last 3 months and “show 250 rows.”
- Look for URLs with no sessions, or very few, in this time interval.
How to fix it?
Once you have detected which pages hardly get impressions or have a high bounce rate, choose one of the following solutions:
- Add more content to those pages (text, images, video, etc.).
- Put a "noindex" tag on the pages that have little real value to rank. This way, the potential of the main pages increases.
Check internal links
Not having internal links that pass authority from one page to another means both a possible indexing problem and a missed opportunity to grow organic traffic. Internal links matter a great deal to search engines, and my advice is to add them on every URL of your website. Some small websites do not use internal links to transmit equity from one page to another, losing a significant part of their organic potential. Linking articles to each other, or from the blog to the online store, adds extra value for Google and the ranking of your keywords.
How to find it?
For small websites, it can easily be done manually by visiting each page and checking whether there are internal links between them. For larger websites, more precise tools are needed, such as Screaming Frog SEO Spider.
How to fix it?
Placing internal links from blog posts to the main pages you want to rank is a great SEO strategy, and the same can be done for online stores. In addition, take advantage of internal links to place keywords in the anchor texts of the hyperlinks. The key is to vary the anchor texts naturally; this transmits even more power to the landing page.
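For a quick audit of which internal links and anchor texts a page already has, the standard library is enough. A sketch; the HTML and hostname are toy examples:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkCollector(HTMLParser):
    """Collect (href, anchor text) pairs for links pointing at the same site."""

    def __init__(self, site_host: str):
        super().__init__()
        self.site_host = site_host
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            host = urlparse(self._href).netloc
            if host in ("", self.site_host):  # relative or same-host: internal
                self.links.append((self._href, "".join(self._text).strip()))
            self._href = None


page = ('<p>Read our <a href="/blog/technical-seo">technical SEO guide</a> '
        'and <a href="https://other.com/x">this external page</a>.</p>')
collector = LinkCollector("example.com")
collector.feed(page)
print(collector.links)
# → [('/blog/technical-seo', 'technical SEO guide')]
```

Running this over your pages shows at a glance which anchors repeat, so you can vary them as suggested above.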
Structured data markup problems
To tell the truth, most of the websites I audit lack data markup. Webmasters do not usually use this Google-supported system to make the search engine understand much better what each page is about. Within SEO optimization, data markup can boost keyword rankings precisely because of that extra understanding. According to Google, schema markup helps it classify a website.
How to find it?
Use Search Console to detect whether there are data markup errors or pages missing markup. Navigate through Search Console:
- Appearance in the search engine
- Structured data
You can also test it with Google's structured data testing tool.
How to fix it?
To fix this problem, use the same error-detection tool to find where the code problem is and solve it. If you use WordPress, a plugin can make this task easier:
All in One Schema plugin. With it, you will get rich snippets, a brief summary of the page, in Google, Yahoo, or Bing search results.
If you use a different CMS, you may need help from a programmer to implement it. You should know that Google supports the following formats:
- JSON-LD (recommended)
- Microdata
- RDFa
Implementing this point can be a competitive advantage, as many web pages still lack it.
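A quick first pass over a page's existing markup can also be scripted: extract the JSON-LD blocks and confirm each parses and declares an `@type`. A sketch with Python's standard library; the sample page is illustrative, and it handles single-object JSON-LD blocks only:

```python
import json
from html.parser import HTMLParser


class JsonLdCollector(HTMLParser):
    """Gather the raw contents of <script type="application/ld+json"> tags."""

    def __init__(self):
        super().__init__()
        self._in_ld = False
        self._buf = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_ld = True
            self._buf = []

    def handle_data(self, data):
        if self._in_ld:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._in_ld:
            self.blocks.append("".join(self._buf))
            self._in_ld = False


def jsonld_types(html: str):
    """Return the @type of every JSON-LD block (raises on invalid JSON)."""
    collector = JsonLdCollector()
    collector.feed(html)
    return [json.loads(raw).get("@type") for raw in collector.blocks]


page = ('<script type="application/ld+json">'
        '{"@context": "https://schema.org", "@type": "Article", '
        '"headline": "Technical SEO"}'
        '</script>')
print(jsonld_types(page))
# → ['Article']
```

This only checks that markup exists and is valid JSON; whether the properties satisfy Google's requirements is still a job for the testing tool above.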
Breadcrumbs for smooth navigation
Breadcrumbs provide one-click access to any page without having to go back several times with the browser button. They help search engine positioning: each link naturally places a keyword, and breadcrumbs can even appear in search results in a special format. They improve the user experience, reducing the bounce rate and increasing time on site, and both factors affect a page's search rankings.
How to activate it?
There are three ways to activate breadcrumbs:
- Custom code. This consists of editing your website's .php files and including the code that prints the breadcrumb trail on each page. It requires knowledge of the WordPress programming API and its architecture.
- Theme without breadcrumb support. If the theme is not prepared to include breadcrumbs, a WordPress plugin with this feature can be installed. However, it requires editing some .php files and adding lines of code, following the plugin manual's instructions.
- Theme with breadcrumb plugin support. The fastest and easiest solution: just install a plugin your theme is compatible with and configure its appearance and behavior.
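Whichever route you take, the markup the breadcrumbs should ultimately produce for search results is a schema.org BreadcrumbList. A sketch that generates that JSON-LD from a page's trail; the names and URLs are illustrative:

```python
import json


def breadcrumb_jsonld(trail):
    """Build BreadcrumbList JSON-LD from (name, url) pairs, homepage first."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }


trail = [
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("Technical SEO", "https://example.com/blog/technical-seo/"),
]
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

The resulting object goes into a `<script type="application/ld+json">` tag in the page head; plugins generate the same structure for you.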
Backlink problems
Backlinks are still one of the main SEO factors Google takes into account when assigning authority and trust to a website. Google scores each backlink differently according to its criteria.
The score given to a link pointing at us from a free directory will be lower than the score of a link from a website covering the same topic. Even so, you should try to acquire external links at a steady pace over time. Keep in mind that links from other websites in the same category or niche are always more valuable than links from sites that have nothing to do with your theme. But this cannot always be controlled: you may have links from pages unrelated to your sector, or from new pages with very little authority, which do not benefit you either.
How to find it?
To analyze your backlinks, use the following tools:
- Search console
- Open Site Explorer
- Monitor Backlinks
How to fix them?
Use these tools to assess backlink quality and to see whether you lack link potential to rank your keywords better. It is important to have good external links that give you authority and strength.
Check your sitemap
The sitemap is another source of technical SEO problems often reported in Google forums and specialized sites. Some are common and easy to solve; others are not. Here are some that do not need a programmer's assistance:
Empty XML sitemap. This can be due to multiple factors: for example, a file may be misnamed, genuinely empty, or incorrectly labeled. Either way, open your sitemap and check whether it is empty. If it is not, check the URLs and verify that you have used the correct tags and not misplaced a file.
Compression error. This means Google could not unzip the file, so go back to your sitemap and check what happened with the compression. The safest fix is to re-compress the file and resubmit it.
Invalid URL. It is likely that, by mistake, you have put some special or unsupported character in a URL, such as a comma. Check all your URLs carefully to determine where the error is; when you find the problem, fix it, update the sitemap, and resubmit it.
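The three checks above can be bundled into one script to run before resubmitting a sitemap. A sketch using Python's standard library; the set of "invalid" characters is a simplified illustration, not the full URL specification:

```python
import gzip
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
BAD_CHARS = set(' ,<>"{}|\\^`')  # characters that should be escaped in URLs


def check_sitemap(data: bytes):
    """Return a list of human-readable problems found in raw sitemap bytes."""
    problems = []
    if data[:2] == b"\x1f\x8b":  # gzip magic number: a compressed sitemap
        try:
            data = gzip.decompress(data)
        except (OSError, EOFError):
            return ["compression error: could not unzip the file"]
    urls = [loc.text or "" for loc in ET.fromstring(data).findall(".//sm:loc", NS)]
    if not urls:
        problems.append("sitemap is empty")
    for url in urls:
        if any(ch in BAD_CHARS for ch in url):
            problems.append(f"invalid URL: {url}")
    return problems
```

Feed it the raw bytes of sitemap.xml (or sitemap.xml.gz); an empty list means it passed these basic checks.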
These technical SEO issues play a significant role in your site's performance and rankings. Once you have found and fixed them, you will see good results quickly. Technical SEO gives you a competitive advantage: many competitors are not yet paying attention to it, and that advantage can be decisive in the battle for top search positions. Focus on these problems, because technical issues can damage the website of your dreams, and most of them are fixable.