SEO fixes graphic image

How to Carry Out Site-Wide Tech SEO Fix in 8 Steps

Publish date
February 2, 2024

Let's follow up on our previous guide on tech SEO audit essentials and tools and examine how to fix the identified tech SEO issues. Today, we will explain what should be done with the broken tech SEO elements crucial for your website performance.

This is the guide on how to make a basic tech SEO fix to maximize the outcome of Googlebot crawls, improve website usability, and boost web page rankings.

1. Fixing Crawlability/Indexability: Robots.txt, Meta Robots Tag, Sitemap.xml

First and foremost, auditors check crawlability and indexing directives saved in robots.txt and meta-robots tag and ensure that sitemap.xml contains all indexable pages.

Robots.txt

This file contains crawling instructions applied site-wide to prevent search bots from accessing certain file types, directories, and pages. Below, you can see the contents of robots.txt from the Webflow website. As you can see, the webmasters blocked crawlers from accessing the user dashboard and API sections:

Viable robots.txt example - screenshot

NOTE: 

  • Google Search Central explains that not every crawler respects robots.txt directives the way trusted search engine bots do. However, you can make an SEO fix and block a web page from those bots with a login wall.
Google's disclaimer on robots.txt - screenshot
  • Pages blocked by disallow rules might still get indexed if referral websites link to them. A nofollow directive in <meta name="robots"/> won't prevent this; you need a login wall or a noindex directive on a crawlable page.
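As a sketch, a robots.txt that blocks the dashboard and API sections described above (the paths here are illustrative, so adjust them to your own site) might look like this:

```txt
# Apply the rules to all crawlers
User-agent: *
# Hypothetical private sections blocked from crawling
Disallow: /dashboard/
Disallow: /api/
# Point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```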

Meta Robots Tag

Setting nofollow/dofollow and noindex/index rules through the meta robots tag allows you to manage how page authority is distributed to external websites. Do a minor SEO fix: set index/nofollow meta robots directives so your web page will appear in search results and maintain its PageRank.
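For example, a tag that keeps a page indexable while withholding link equity from the pages it links to would sit in the <head> and look like this:

```html
<!-- Index this page, but don't pass authority through its links -->
<meta name="robots" content="index, nofollow">
```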

Sitemap.XML

A Moz experiment proved that by submitting sitemap.xml to search engines, webmasters can speed up site crawling and the indexing of new pages. So, prompt sitemap.xml submission allows SEO specialists to increase search visibility and start growing rankings as fast as possible.

NOTE: Moz experts still don’t recommend sending sitemap.xml to Google unless you’re 100% sure your URL structure is flawless.
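For reference, a minimal sitemap.xml listing a single indexable page (example.com is a placeholder domain) follows this structure:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per indexable page -->
    <loc>https://www.example.com/blog/tech-seo-fix</loc>
    <lastmod>2024-02-02</lastmod>
  </url>
</urlset>
```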

2. Revision of Site Structure

The site structure reflects the nesting of its subdomains, categories, and pages. A technical SEO audit aims to spot and correct inconsistencies in the flat hierarchy tree, which usually looks like this:

Flat site architecture - schematic art, electric crimson background

There’s a rule of thumb for an SEO fix like this: users must be able to access essential pages within 3 clicks of the homepage. If so, the site structure is considered both user-friendly and SEO-friendly.

Otherwise, consider moving your pages to a dedicated subfolder/subcategory with lower click depth. Alternatively, you can guide users and crawlers to these deep pages with proper internal linking.

Other points to consider:

  • Navigational Items. Make them noticeable and conveniently placed. Group subcategory links in dropdown lists attached to category buttons. Avoid two-dimensional and large-scale dropdowns: they often don’t fit the screen in mobile browsing and hurt usability.
  • URL Structure. Follow the all-time SEO basics and create human-readable URLs. Always keep crawl depth in mind – implement filters and tags so users can navigate through listings independently.

3. Fixing Internal Links

Improving internal linking is paramount. It ensures:

  • Faster discovery of pages for users and search engine bots
  • Page and domain authority distribution between pages for better rankings
  • Improved UX due to seamless and logical website navigation.

Technical SEO auditors must collect and fix broken links and orphaned page URLs.

Broken internal links return 404 or 400 response codes, so you can easily filter them out in tech audit reports. Alternatively, you can explore separate reports in online tools: “Internal Linking” in Semrush and “Internal Link Opportunities” in Ahrefs.

There are variations of SEO fixes for broken links depending on circumstances: 

  • The link is broken due to a changed URL structure or misprinted URL text. The best way is to replace it with search-and-replace plugins available for your CMS. Don’t forget to manually replace old URLs with new ones in the navigation menus.
  • The link leads to non-existing pages. In this case, you’ll need to remove broken links and reconsider what relevant pages you can link to.

Even though Google tolerates soft 404 pages, you need to reduce their number. We’ll touch on that further as we get to page response codes.

4. Analysis of Page Speed Insights Metrics

As previously mentioned, site speed metrics directly contribute to SERP rankings. To make an analysis of PageSpeed Insights conclusive, we recommend you monitor these 3 Core Web Vitals metrics and set them as primary goals for your tech SEO fix.

#1. Largest Contentful Paint (LCP) for Loading Performance

LCP indicates how fast your website renders the largest visible element of the page. Key visual elements differ depending on page content and purpose, but the largest one is what defines the page's relevance to users. So, in the case of this blog post, it is likely to be a text block.

The green LCP score is below 2.5 sec.

SEO fix options to improve LCP:

  • Keep Server Response Time Low. Enable server-level page caching.
  • Optimize Images. Try bulk image optimizers and compressors to maintain quality while reducing the size of hefty images. Also, switch to .webp format, which most web developers generally endorse.
  • Use CDN. CDN minimizes server response time by routing user requests to the nearest edge server. These globally distributed servers ensure faster retrieval of necessary assets.
  • Minification of CSS, JS, and HTML code. It can be handled using plugins. If you're dealing with a Webflow site, you can harness its built-in minification feature.

#2. First Input Delay (FID) for Interactivity

FID measures the delay between a user's first interaction with the page (a click or a tap) and the moment the browser can actually respond to it. The thing is that FID depends on the field data collected through the Chrome User Experience Report (CrUX).

If your website hasn't provided sufficient field data to CrUX, you can rely on the Total Blocking Time (TBT) calculated from Lighthouse lab data. The green score for TBT is below 300 ms. Remember that TBT and LCP together account for 55% of your PageSpeed Insights score.

SEO fix options to improve FID:

  • Defer JS. The point is to delay JS files until the main page content renders completely. The easiest way is to install plugins that defer JS code parsing automatically. Alternatively, developers can add the defer attribute – <script src="code.js" defer></script> – either manually or programmatically.
  • Delay JS. Delaying JS execution until user interaction works similarly. To set timeouts manually, you can wrap the code in the setTimeout() function and specify the delay length in ms.
  • Prefetch DNS Requests. Prefetching is a reasonable tech SEO fix in case you have hefty embedded elements with third-party code, for example, video or audio players. You may enable DNS prefetching manually by inserting a <link rel="dns-prefetch"> element into the <head> section or use a plugin.
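To illustrate the three options above in one place, here is a sketch (the script names and the third-party domain are placeholders, not real assets):

```html
<head>
  <!-- Prefetch DNS for a hypothetical third-party media host -->
  <link rel="dns-prefetch" href="https://player.example-cdn.com">
</head>
<body>
  <!-- Defer: execute code.js only after the HTML is fully parsed -->
  <script src="code.js" defer></script>
  <!-- Delay: postpone loading a non-critical widget by 3000 ms -->
  <script>
    setTimeout(function () {
      var s = document.createElement("script");
      s.src = "widget.js"; // hypothetical non-critical script
      document.body.appendChild(s);
    }, 3000);
  </script>
</body>
```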

Note that JS code minification benefits FID just as it does LCP.

#3. Cumulative Layout Shift (CLS) for Visual Stability

Gauging CLS is also essential for technical SEO audit services, as this metric weighs 25% of the total PageSpeed score. CLS indicates the visual stability of the layout during the period with no user interactions. Your target value for a green score is 0.1 or below.

SEO fix options to improve CLS:

  • Set the Aspect Ratio through CSS. Additionally, you must ensure that images and video containers have definite width and height attributes. Thus, the browser will get the block's dimensions before rendering the page.
  • Manage Dynamic Content. In the case of dynamic content, JS delaying does the job, too.
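A minimal sketch of the aspect-ratio fix, assuming a hypothetical .hero-image class and illustrative dimensions:

```html
<!-- Explicit width/height let the browser reserve space before rendering -->
<img class="hero-image" src="hero.webp" width="1200" height="630" alt="Hero image">
<style>
  /* aspect-ratio keeps the reserved box stable on responsive layouts */
  .hero-image {
    width: 100%;
    aspect-ratio: 1200 / 630;
    height: auto;
  }
</style>
```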

As you can see, improving Core Web Vitals metrics takes some expertise and effort. So, if you feel like spinning your wheels and the low scores persist, applying for professional assistance with a site-wide SEO fix would be wise. 

5. Identifying and Fixing Code Issues

Search engines rely on clean code to read webpage content correctly. Here are the core elements that auditors must inspect.

Meta Tags

These are paramount for standing out in SERP and informing searchers of what to expect from visiting your website. We already covered the key meta title and meta description requirements in our article dedicated to the on-page SEO checklist.

But keep in mind that re-optimizing meta titles and descriptions with the keywords that the page actually ranks for (and these might change over time) or the new keywords you’d like to switch to is necessary to maintain and grow search rankings. So keep an eye on search trends and analyze your competitors to stay on top of it.

Rel="canonical" Attribute

Canonicalization informs search engine crawlers that there's a primary version of a certain page, and it is prioritized for indexing. The common SEO fix case is a missing rel="canonical" attribute.
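The fix itself is a one-line addition to the page's <head> (the URL below is a placeholder):

```html
<!-- Tell crawlers which version of the page is primary -->
<link rel="canonical" href="https://www.example.com/blog/tech-seo-fix">
```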

Luckily, most modern website builders support autogenerated global canonicals. So does Webflow, as it sets self-referencing rel="canonical" values.

It's worth mentioning that a redirect from the HTTP to the HTTPS version serves the same purpose as canonicalization through the rel="canonical" attribute. So, setting one up makes sense if you've implemented the HTTPS protocol recently.

Also, remember that Google treats every URL listed in sitemap.xml as a suggested canonical.

Hreflang Attribute

A filled-in hreflang attribute helps search engines recognize that you run a multi-language website. Thus, they can serve users search results that match their locations and language preferences.

The most straightforward SEO fix for hreflang is to edit your sitemap.xml. For instance, if you run a website both in Spanish and English, you must add 2 corresponding <xhtml:link> entries:

Valid <xhtml:link> entries in sitemap.xml  - screenshot
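For reference, such entries (with example.com as a placeholder domain) might look like this; note that the xhtml namespace must be declared on the <urlset> element:

```xml
<!-- Requires xmlns:xhtml="http://www.w3.org/1999/xhtml" on <urlset> -->
<url>
  <loc>https://www.example.com/en/page</loc>
  <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page"/>
  <xhtml:link rel="alternate" hreflang="es" href="https://www.example.com/es/page"/>
</url>
```

The Spanish URL gets a mirrored entry of its own, so every language version cross-references all the others.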

Structured Data Markup

Schema markup code is vital for higher search visibility, impressions, and click-throughs. We’ve made an exhaustive Schema implementation guide, so check it right now to learn how to inspect, validate, and implement structured data markup.

JavaScript

When JavaScript breaks, search engines cannot render interactive items and, hence, index the page properly. To ensure that JS code runs correctly:

  • Double-Check the Robots.txt File. Ensure that Googlebot can access JS assets.
  • Enable JS Rendering for SEO Audit Tools. Add reporting on JavaScript issues to crawl reports to inspect troubled URLs further.
  • Use the Google Search Console (GSC) URL Inspection Tool. It can point out directly whether the Googlebot renders your JS code correctly. If not, it might be caused by excessive JS loading timeouts.

6. SEO fix for HTTPS Issues

The technical SEO audit service primarily checks for the following issues:

  • SSL/TLS Certificate Validity. You can check the details on the SSL certificate provider and its expiration date directly in Google Chrome.
  • Mixed Content Issues. These stem from browsers requesting some types of content via an HTTP connection instead of HTTPS. To detect which exact files cause mixed content issues, try scanning your URLs with an SSL-check crawler.

7. Fixing 3XX, 4XX, and 5XX Response Codes

Managing 3XX, 4XX, and 5XX response codes is a massive part of the SEO fix process. Let’s discuss what you should do about them.

3XX Response Codes

Permanent 301 redirects are considered a healthy SEO practice that doesn’t waste the crawl budget; webmasters typically use them when changing domain names or moving between hosts. In contrast, 302 and 307 redirects are temporary and suit cases like shutting pages down for maintenance.

However, 3XX redirects slow down page load, meaning they affect Core Web Vitals and search rankings, too. Mindless redirection can also cause redirect chains and loops that confuse crawlers and users. Substitute those out-of-place redirects with direct links, but remember that sparingly used redirects are generally OK.
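As an illustration, assuming an Apache server and placeholder paths, a permanent redirect can be declared in .htaccess like this:

```apache
# Permanently redirect an outdated URL to its replacement (paths are placeholders)
Redirect 301 /old-page https://www.example.com/new-page
```

Pointing the redirect straight at the final destination, rather than at another redirect, avoids the chains mentioned above.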

4XX Response Codes

SEO specialists typically look out for 404 error pages as they highlight the gaps in internal linking.

Even though some webmasters set up soft 404 error pages, there's a catch. See, those don't return a 404 or 410 code, so Googlebot can still crawl and index them. Therefore, it's a questionable SEO fix method. Soft 404s can eat up a great deal of the crawl budget if you don't redirect them to relevant pages.

5XX Response Codes

It is worth assessing server-side errors over time. Those are hard to predict, but mapping them on a timeline helps to see whether some of them reoccur systematically.

We'd primarily recommend you focus on response code inconsistencies. In particular, look for URLs that return the 503 code, as it’s a sure sign of server overload.

Response codes analysis in Screaming Frog - screenshot

8. Identifying Mobile UX Issues

Google retired the Mobile Usability report from GSC in December 2023. That said, you can glean similar insights on mobile friendliness by checking with Lighthouse reports.

Illegible Font Sizes

Lighthouse analyzes font sizes to detect which pages aren’t optimized for mobile viewers. An illegible font size issue occurs if more than 40% of a webpage’s text is set in a font smaller than 12 px.

Illegible font size issue revealed by Lighthouse - screenshot

The SEO fix for this problem implies adjusting the corresponding CSS styles. No one-size-fits-all solution exists, so re-test the web page after each font tweak.
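A hedged starting point (the 16 px base and the .fine-print selector are illustrative, not universal):

```css
/* Keep body text comfortably above the 12 px legibility threshold */
body {
  font-size: 16px;
}
@media (max-width: 480px) {
  /* Hypothetical class that previously rendered below 12 px on mobile */
  .fine-print {
    font-size: 12px;
  }
}
```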

Incorrect Scaling of Tap Targets

Lighthouse flags tap elements as incorrect when:

  • They are smaller than 48 x 48 px
  • At least 25% of their area overlaps with other tap targets within the 48 px diameter
Revealing incorrect tap targets scaling with Lighthouse - screenshot
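A minimal CSS sketch to meet the sizing guideline above (the .nav-button selector is hypothetical):

```css
/* Give tap targets at least a 48 x 48 px hit area with breathing room */
.nav-button {
  min-width: 48px;
  min-height: 48px;
  margin: 8px; /* spacing reduces overlap with neighboring targets */
}
```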

Incorrect Meta Viewport Tag

Lighthouse primarily reports a missing <meta name="viewport"/> tag. The proper tag should look like this:

<meta name="viewport" content="width=device-width, initial-scale=1">

Normally, the meta viewport tag contains initial-scale=1.0, and page scaling breaks if the value is greater than 1.0 or set to zero. Occasionally, the <head> section contains more than one meta viewport tag – only one should remain.

So, here are the go-to suggestions on a site-wide tech SEO fix. But don’t rush to roll up your sleeves and dive headfirst into site audit and troubleshooting. Consider the benefits of hiring a professional team of talented and seasoned SEO specialists.

Resolve SEO Fix Essentials Nice and Easy with Marketechy

You can get much more out of your tech SEO audit with Marketechy’s team by your side. Our SEO specialists carry out a full-fledged technical SEO overhaul and measure its impact. Make regular tech SEO a powerful driver for your overall digital marketing efforts.

Ready to grow your online presence and quality leads? Drop us a line!