Let's follow up on our previous guide on tech SEO audit essentials and tools and examine how to fix the tech SEO issues you identified. Today, we will explain what to do with broken tech SEO elements that are crucial to your website's performance.
This is a guide to basic tech SEO fixes that maximize the outcome of Googlebot crawls, improve website usability, and boost web page rankings.
1. Fixing Crawlability/Indexability: Robots.txt, Meta Robots Tag, Sitemap.xml
First and foremost, auditors check the crawling and indexing directives stored in robots.txt and the meta robots tag, and ensure that sitemap.xml contains all indexable pages.
Robots.txt
This file contains crawling instructions applied site-wide to prevent search bots from accessing certain file types, directories, and pages. Webflow's website, for example, uses robots.txt to block crawlers from its user dashboard and API sections.
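A sketch of what such rules look like (the paths below are illustrative, not Webflow's actual file):

```
# Apply the rules to all crawlers
User-agent: *
# Keep bots out of private or non-content sections
Disallow: /dashboard/
Disallow: /api/
```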
NOTE:
- Google Search Central explains that not every crawler respects robots.txt directives the way trusted search engines do. However, you can make an SEO fix and hide a web page from such bots behind a login wall.
- URLs may still get indexed through inbound links from referral websites even if disallow rules forbid Googlebot from crawling them. You can prevent this only by setting a login wall or by allowing crawling and using a noindex directive in <meta name="robots">.
Meta Robots Tag
Setting follow/nofollow and index/noindex rules through the meta robots tag allows you to manage how page authority is distributed to external websites. A minor SEO fix: set the index, nofollow directives so your web page appears in search results while retaining its PageRank instead of passing it on through links.
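A minimal sketch of that fix, placed in the page's <head>:

```html
<!-- Allow indexing of this page, but don't follow its links -->
<meta name="robots" content="index, nofollow">
```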
Sitemap.xml
A Moz experiment showed that by submitting sitemap.xml to search engines, webmasters can speed up crawling and indexing of new pages. Prompt sitemap.xml submission therefore lets SEO specialists increase search visibility and start growing rankings as quickly as possible.
NOTE: Moz experts still don't recommend sending sitemap.xml to Google unless you're 100% sure your URL structure is perfect.
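For reference, a minimal sitemap.xml holds a urlset with one entry per indexable page (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/tech-seo-fixes</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```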
2. Revision of Site Structure
The site structure reflects the nesting of a website's subdomains, categories, and pages. A technical SEO audit aims to spot and correct inconsistencies in the hierarchy tree, which should ideally stay flat.
There’s a rule of thumb for an SEO fix like this: users must be able to reach essential pages within 3 clicks of the homepage. If they can, the site structure is considered both user-friendly and SEO-friendly.
Otherwise, consider moving your pages to a dedicated subfolder/subcategory with a lower click depth. Alternatively, you can guide users and crawlers to these deep pages with proper internal linking.
Other points to consider:
- Navigational Items. Make them noticeable and conveniently placed. Group subcategory links in dropdown lists attached to category buttons. Avoid two-dimensional and large-scale dropdowns: they commonly don’t fit mobile screens and hurt usability.
- URL Structure. Follow the all-time SEO basics and create human-readable URLs (see the example after this list). Always keep crawl depth in mind: implement filters and tags so users can navigate through listings independently.
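To illustrate the URL point, here's a hypothetical before/after (both URLs are made up for this example):

```
# Opaque, parameter-driven URL:
https://example.com/catalog?cat=17&id=8842

# Human-readable, shallow, keyword-rich URL:
https://example.com/furniture/office-chairs/ergonomic-mesh
```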
3. Fixing Internal Links
Improving internal linking is paramount. It ensures:
- Faster discovery of pages for users and search engine bots
- Page and domain authority distribution between pages for better rankings
- Improved UX due to seamless and logical website navigation
Technical SEO auditors must find and fix broken links and orphaned pages.
Broken internal links return 404 or 400 response codes, so you can easily filter them out in tech audit reports. Alternatively, you can explore dedicated reports in online tools: “Internal Linking” in Semrush and “Internal Link Opportunities” in Ahrefs.
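If you'd rather spot-check a handful of URLs yourself, here is a minimal sketch using Python and the requests library (the URL list is a placeholder you'd export from your CMS or crawler):

```python
import requests

# Placeholder list of internal URLs to verify
urls = [
    "https://example.com/blog/tech-seo-audit",
    "https://example.com/old-landing-page",
]

for url in urls:
    try:
        # HEAD keeps the check lightweight; some servers only answer GET
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"Broken: {url} -> {response.status_code}")
    except requests.RequestException as exc:
        print(f"Failed: {url} -> {exc}")
```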
There are several SEO fixes for broken links, depending on the circumstances:
- The link is broken due to a changed URL structure or a misprinted URL. The easiest way is to update it with a search-and-replace plugin available for your CMS. Don’t forget to manually replace old URLs with new ones in the navigation menus.
- The link leads to a non-existent page. In this case, you’ll need to remove the broken link and reconsider which relevant pages you can link to instead.
Even though Google considers soft 404 pages an acceptable stopgap, you need to reduce their number. We’ll touch on that when we get to page response codes.
4. Analysis of PageSpeed Insights Metrics
As previously mentioned, site speed metrics directly contribute to SERP rankings. To make your PageSpeed Insights analysis conclusive, we recommend monitoring these 3 Core Web Vitals metrics and setting them as the primary goals of your tech SEO fix.
#1. Largest Contentful Paint (LCP) for Loading Performance
LCP indicates how fast your website renders the biggest visual item on the page. Which element that is varies with the page's content and purpose, but it's the one that defines the page's relevance to users. In the case of this blog post, for example, it is likely to be a text block.
A green (good) LCP score is 2.5 seconds or less.
SEO fix options to improve LCP:
- Keep Server Response Time Low. Enable server-level page caching.
- Optimize Images. Try bulk image optimizers and compressors to maintain quality while reducing the size of hefty images. Also, switch to the .webp format, which most web developers endorse (see the example after this list).
- Use a CDN. A CDN minimizes server response time by routing user requests to the nearest edge server. These globally distributed servers ensure faster retrieval of the necessary assets.
- Minify CSS, JS, and HTML. This can be handled with plugins; if you're dealing with a Webflow site, you can harness its built-in minification feature.
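As an example of the image fix above, you can convert images with Google's cwebp command-line tool (the filenames and quality level are placeholders):

```
# Convert a hero image to WebP at quality 80
cwebp -q 80 hero.png -o hero.webp
```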