Maximising Website Performance with Technical SEO Audits

In this blog, we discuss what technical SEO is, how to conduct a technical SEO audit and what some of the factors of successful technical SEO are.

February 19, 2021
SEO
Nathan Watkins

When it comes to getting the very best out of your business's website and how it performs in the SERPs, there's much more to consider than just choosing the right keywords to rank for. A massive part of the field of SEO is Technical SEO, and marketers across the industry know that to maximise performance, your website needs to be succeeding not just in its on-page optimisation, but in the backend too. In this blog article, we will discuss what technical SEO is, how to conduct a technical SEO audit and what some of the different factors of successful technical SEO are.


What is Technical SEO?

In the same way that on-page optimisation refers to the different ways an SEO can improve a website's on-page performance, Technical SEO refers to optimising the way your site is performing in the background. When considering Technical SEO, we refer to factors such as your website's site speed, how your website is crawled and indexed by search engines, and any other technical process designed to improve your website's visibility. Technical SEO is not concerned with any of the website's actual content; its primary goal is to optimise your website's infrastructure and help search engines access, crawl, interpret and index your website without any problems.


What is a Technical SEO Audit?

A technical SEO audit is where you conduct a full analysis of your website's technical setup to ensure that it can be crawled, indexed and ranked appropriately by web crawlers without issue. When conducting a Technical SEO audit for your website, there are loads of useful tools you can take advantage of to gain a greater understanding of your website's technical capabilities, such as:

  • Screaming Frog: Screaming Frog's SEO Spider is a website crawler tool that allows you to improve website performance by highlighting common SEO issues. Screaming Frog can be used free of charge to crawl up to 500 URLs per crawl.
  • Search Console: Google Search Console is a web service that allows users to check the index status of the pages of their websites, as well as check for any issues and errors across their websites.
  • Google Tag Manager: GTM is a very useful tool for managing all of the different tags that you use to track your website’s performance. When performing a technical SEO audit you may want to check in on whether all of your tags are working as they should.
  • GT Metrix: GT Metrix is a useful tool that can be used to assess your website's speed. It does so by providing different "grades" for various aspects of the website that can be optimised to achieve faster speeds.
  • PageSpeed Insights: Google PageSpeed Insights is another tool that can be used to evaluate website page speed. It does so by analysing the content on a web page before generating suggestions to make that page faster.
  • Siteliner: Siteliner is a useful tool that allows you to check for any duplicate content on your website or between different sites. Duplicate content can have a detrimental impact on your wider SEO endeavours.

Technical SEO Audits: Checklist

When looking to conduct technical SEO audits on your business's websites, the best first step to take is to compile a checklist of all the different aspects of your website that you want to analyse and eventually improve using some of the aforementioned tools. Your checklist may be different from ours depending on how in-depth you want to go with your technical SEO audits, but some reasonable grounds to cover when conducting an audit are:


  • Analysing the site speed of your website.
  • Checking your sitemap.
  • Examining your Robots.txt files and Meta Robots directives.
  • Checking Canonicalisation.
  • Checking whether Hreflang tags are working correctly.
  • Evaluating the mobile usability of your website.
  • Seeing whether your redirects have been set up correctly.
  • Checking your site for crawl errors and error status codes.
  • Checking for insecure content.
  • Analysis of Metadata.
  • Evaluation of Schema Markup.
  • Check that there's no duplicate content across your website.
  • Evaluate your Header Tag structure.
  • Evaluate image alt attributes.


Site Speed

Site speed is something you need to get right when analysing the performance of your website. Not only is good site speed a valuable ranking factor that directly influences your site's ranking potential, but it can also impact performance indirectly, for example by increasing the bounce rate of your website due to irritatingly slow page load times. Using tools such as GT Metrix and PageSpeed Insights, you can gain an insight into how fast your website runs and the areas where it suffers, such as (but not limited to) the following:

  • JavaScript: While JavaScript can enhance your pages' look, it can also affect the time it takes for them to load. You can improve your website's speed by deferring the parsing of JavaScript so that it only loads when needed.
  • Image Sizes: The images on a page can sometimes be much larger than they need to be, negatively affecting page load times. Optimising the sizes of the imagery used throughout your website can have a positive impact on site speed.
  • Browser Caching: Enabling browser caching allows files from your web page to be stored in the user's browser. This speeds things up by preventing those files from needing to be downloaded entirely again on repeat visits.

Hosting

The hosting setup behind your website can also have a detrimental impact on your site speed and your wider SEO efforts. You'll need to access your server directly in order to manually check for any hosting issues, the most common of which are a notably slow server response and issues with the Top-Level Domain (TLD). Your TLD should be the one most relevant to the country your website operates in; for example, if your website primarily operates in the UK, then your TLD should ideally be a .co.uk rather than a .com. If your website holds both domains, you would need to redirect one version to your preferred one. Investigating hosting issues can seem complicated and more of a departure from standard SEO checks than some of the others in this list, so working closely with your developers can make this process much easier.

Insecure Content

Insecure content refers to pages still being served over HTTP rather than HTTPS (HyperText Transfer Protocol Secure); every page on your website should be using HTTPS instead of the old, much less secure HTTP URL structure. Screaming Frog's SEO Spider tool can be used to easily see a full list of every page and its respective address. Pages using HTTP should be updated to redirect to their HTTPS addresses as soon as possible.
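As a rough illustration of the check itself, here's a short Python sketch that flags HTTP URLs in a crawl and maps each to its HTTPS equivalent. The URLs are hypothetical, and a real crawler such as Screaming Frog does this at scale:

```python
from urllib.parse import urlparse

def find_insecure_urls(urls):
    """Return a mapping of HTTP URLs to their HTTPS equivalents."""
    fixes = {}
    for url in urls:
        if urlparse(url).scheme == "http":
            fixes[url] = "https://" + url[len("http://"):]
    return fixes

# Example crawl output (hypothetical URLs):
crawled = [
    "http://www.example.com/about",
    "https://www.example.com/contact",
]
print(find_insecure_urls(crawled))
# {'http://www.example.com/about': 'https://www.example.com/about'}
```

Each entry in the resulting mapping is a redirect you would then set up on the server.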

Sitemaps

A sitemap allows web crawlers such as Googlebot to easily locate and identify the pages that you would like to be indexed on your website. When conducting a technical SEO audit, you will need to check that your sitemaps are crawlable and contain no errors, broken links or URLs blocked by robots.txt directives that would otherwise affect the indexing process.

Tools such as Search Console allow you to quickly check for such errors after uploading your most recent sitemap to the tool.
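To illustrate what such a check starts from, here's a minimal Python sketch that pulls every `<loc>` URL out of an XML sitemap using only the standard library; the sitemap content is a made-up example:

```python
import xml.etree.ElementTree as ET

# A minimal XML sitemap, as a stand-in for a real sitemap.xml file.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/blog</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Extract every <loc> entry from a sitemap, ready to be re-crawled."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP))
# ['https://www.example.com/', 'https://www.example.com/blog']
```

An audit would then request each extracted URL and confirm it returns a healthy status code.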

Robots.Txt & Meta Robots

A robots.txt file and Meta Robots directives are two mechanisms you can use to tell Google which pages you would like crawled and indexed, and in what way. The robots.txt file controls which areas of your site crawlers can access, while a Meta Robots tag (such as noindex) controls whether an individual page appears in the index; the most common use for both is to flag pages that you do not want indexed (for example, pages with no inherent search value, such as a Cookies policy page). In your technical SEO audit, you can check both of these by using Screaming Frog's SEO Spider. You need to ensure that both are being correctly applied and are not accidentally blocking web crawlers from pages that you do want to be indexed.
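The logic behind such a check can be sketched with Python's built-in robots.txt parser; the robots.txt content below is a hypothetical example:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: blocks the cookies policy page, allows everything else.
ROBOTS_TXT = """User-agent: *
Disallow: /cookies-policy
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Pages you want indexed must not be accidentally disallowed.
print(rp.can_fetch("*", "https://www.example.com/blog"))            # True
print(rp.can_fetch("*", "https://www.example.com/cookies-policy"))  # False
```

Running every important URL through a check like this quickly surfaces pages that are blocked by mistake.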

Hreflangs

Hreflang tags are code snippets that tell web crawlers when a page has multiple language variations. This prevents crawlers from viewing the page as duplicate content and enables Google to serve the correct language version of the page to the relevant nationalities/users. While this might not be necessary for every website, sites where Hreflangs are present will need to be checked to see if they are working as they should and whether they are accounting for all of the language variations they need to.
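As a simplified illustration of what an hreflang check looks at, the sketch below collects the hreflang alternates declared on a page using Python's standard-library HTML parser; the markup and URLs are invented for the example:

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collect hreflang alternates declared in a page's <head>."""
    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a.get("href")

# Sample <head> markup with English and French variations (hypothetical URLs).
PAGE = """<head>
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr/" />
</head>"""

collector = HreflangCollector()
collector.feed(PAGE)
print(sorted(collector.alternates))
# ['en-gb', 'fr-fr']
```

An audit would compare this list against the language versions the site actually has, flagging any variation that is missing a tag.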

Mobile Usability

Gone are the days when websites would simply be viewed on a desktop computer; as of January 2021, over 56% of monthly site visits were made using a mobile device. It is imperative that your website is set up to be as mobile-friendly as possible, which is something that a technical SEO audit will also examine. Google Search Console provides you with a very handy mobile usability report. This report can highlight issues such as content being too wide, text fonts being too small, clickable elements being positioned too close together for mobile users, and much more.

Canonicalisation 

Canonical tags are snippets of code that you apply to web pages to instruct web crawlers on which version of a page needs indexing. This is commonly seen in websites that sell products with near-identical page content, such as furniture with the slightest of deviations between products (e.g. colours, materials etc.). Since the pages are so similar, it is best to have only one version of the page indexed; the canonical tag tells search engines which page is the "canonical" one. Consistency in the application of these tags will need to be checked, as will every page having the correct self-canonicalisation tag applied (even pages that don't have multiple versions need to be canonicalised to themselves).

You can check all of your canonical tags at once by using Screaming Frog’s SEO Spider.
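As a small illustration of the self-canonicalisation check, the Python sketch below extracts a page's canonical tag and compares it to the page's own URL; the markup and URLs are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the rel=canonical URL out of a page's markup."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def check_self_canonical(page_url, html):
    """Return True if the page canonicalises to itself."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical == page_url

# A colour-variation page pointing at its "master" version (hypothetical URLs).
VARIANT = '<head><link rel="canonical" href="https://www.example.com/sofa" /></head>'
print(check_self_canonical("https://www.example.com/sofa-blue", VARIANT))  # False
print(check_self_canonical("https://www.example.com/sofa", VARIANT))       # True
```

The first result is expected for a deliberate variation page; a False on a page that has no variations is the kind of inconsistency the audit should flag.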

Redirects

Put simply, having too many redirects and too long a redirect chain will slow your website down and make it frustrating for a user to navigate. A redirect chain is where a user is taken through multiple redirecting pages before landing on their desired page, and it is something you want to avoid. By crawling your website with a tool such as Screaming Frog, you can identify all of the instances of pages redirecting and see if you can better optimise this process, for example by eliminating redirect chains so that the first redirect takes the user immediately to their target page.
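The chain-flattening fix can be sketched in a few lines of Python: given a mapping of redirect sources to targets (hypothetical paths below), collapse every chain so each source points straight at its final destination:

```python
def flatten_redirects(redirects):
    """Collapse redirect chains so every source points straight at its final target."""
    flat = {}
    for src in redirects:
        target = redirects[src]
        seen = {src}
        # Follow the chain until we reach a page that no longer redirects
        # (the `seen` set guards against redirect loops).
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flat[src] = target
    return flat

# /old-page -> /interim-page -> /new-page becomes two direct redirects.
chain = {"/old-page": "/interim-page", "/interim-page": "/new-page"}
print(flatten_redirects(chain))
# {'/old-page': '/new-page', '/interim-page': '/new-page'}
```

The flattened mapping is what you would then implement in your server's redirect rules.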


You should also ensure that you are using the right redirects for the correct pages. A 301 redirect, for example, is used when a page has moved permanently and is never going to return; a 302 redirect is a temporary measure for pages that should eventually return to your site. Ensure that you are using these in the right instances, or website performance and usability can suffer.

Status Codes

Status codes are some of the most obvious errors that will present themselves to you when completing a technical SEO audit, and there are a few variations you need to be aware of and know how to deal with. 4xx errors (most commonly a 404), for example, usually signify that a page is missing, while 5xx errors indicate server-side problems in accessing pages. All of your web pages should be returning either a 200 code (meaning that the page is fine) or a 3xx redirect code if you are using redirects on those pages. To deal with error codes, the most common solution is to crawl your site and then set up a redirect from the missing pages to the most relevant alternative pages on your site (making sure to avoid redirect chains, of course!).
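The triage step can be sketched as a simple Python function that sorts a crawl's results (hypothetical URLs and codes below) into the buckets described above:

```python
def triage_status_codes(crawl_results):
    """Sort crawled URLs into healthy, redirecting and problem pages."""
    report = {"ok": [], "redirect": [], "missing": [], "server_error": []}
    for url, code in crawl_results.items():
        if 200 <= code < 300:
            report["ok"].append(url)
        elif 300 <= code < 400:
            report["redirect"].append(url)
        elif 400 <= code < 500:
            report["missing"].append(url)  # candidates for a redirect to a relevant page
        else:
            report["server_error"].append(url)
    return report

# Hypothetical crawl output: URL -> status code.
crawl = {"/": 200, "/old-blog": 301, "/ghost-page": 404, "/broken": 500}
print(triage_status_codes(crawl))
# {'ok': ['/'], 'redirect': ['/old-blog'], 'missing': ['/ghost-page'], 'server_error': ['/broken']}
```

Everything in the "missing" bucket becomes a redirect candidate; anything in "server_error" goes to your developers.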

Metadata

Metadata is something that we have covered previously in our blog dedicated to on-page optimisation. Meta Titles and Meta Descriptions should be found on every page and are the snippets of descriptive content you see when viewing a SERP. It is imperative that metadata be of an optimal length. If your Meta Titles and Descriptions are too long, they can be truncated, which means you miss out on including all of the information you could be giving to the user, but if they're too short, the search engines may rewrite or ignore them entirely. Meta Titles should be below 66 characters in length, and Meta Descriptions should be between 120 and 155 characters in length, while still containing the keywords you are targeting on each page.
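Those length rules are trivial to check programmatically; here's a small Python sketch using the thresholds above (the example metadata is this post's own title and summary):

```python
def audit_metadata(title, description):
    """Flag meta titles over 66 characters and descriptions outside 120-155."""
    issues = []
    if len(title) > 66:
        issues.append("title too long")
    if len(description) < 120:
        issues.append("description too short")
    elif len(description) > 155:
        issues.append("description too long")
    return issues

title = "Maximising Website Performance with Technical SEO Audits"
description = ("In this blog, we discuss what technical SEO is, how to conduct "
               "a technical SEO audit and what the factors of successful technical SEO are.")
print(audit_metadata(title, description))
```

Run over every page in a crawl export, a check like this gives you a ready-made to-do list of metadata to rewrite.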

Campaign Tracking

If you are currently tracking the performance of your website and any marketing campaigns through tagging (which we would highly recommend you do), then you may want to check that your tags are still running as they should. If you’re using Google Tag Manager, then you can easily check this with the Google Tag Assistant Chrome extension. Otherwise, you can simply click around your website for a bit and see whether the data from your actions populates in Google Analytics. If it doesn’t, then there may be issues with the code.

For more information on tagging and Google Tag Manager, you can check out our dedicated blog on using Google Tag Manager.

Duplicate Content

Duplicate content on your website can come in two primary forms: content that appears on multiple different pages of your own website (which can be addressed using canonical tags), or the more serious issue of content that also exists on another website. Whether you have unintentionally copied another website's content a bit too closely or vice versa, the content will need to be changed to make it more unique. Pages with unique content are seen as much more valuable by search engines and can be a reasonably significant ranking factor for your website. Useful tools for checking for duplicate content include Siteliner and Copyscape.
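The underlying idea (which tools like Siteliner implement far more robustly) can be sketched with Python's standard-library sequence matcher, comparing page copy pairwise; the sample text is invented:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Rough similarity ratio between two blocks of content (0.0 to 1.0)."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_a = "We sell handmade oak furniture, crafted in our Dorset workshop."
page_b = "We sell handmade oak furniture, crafted in our Somerset workshop."
page_c = "Read our latest guide to conducting a technical SEO audit."

print(round(similarity(page_a, page_b), 2))  # near-duplicate, close to 1.0
print(round(similarity(page_a, page_c), 2))  # distinct content, much lower
```

Pairs scoring close to 1.0 are the ones to rewrite or canonicalise.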

Header Tags

Header tags are crucial in the structure of your website's pages, and a lack of a clear header tag structure can negatively influence your website's rankings. You need to make sure that every page has an H1 title at the top of the page, that longer stretches of content are correctly segmented into sections under H2 and H3 subheadings, and that there isn't more than one H1 tag per page.
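The single-H1 check can be sketched with a small standard-library HTML parser; the page markup below is a contrived example with a deliberate duplicate H1:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Count heading tags so structure problems (e.g. duplicate H1s) stand out."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

PAGE = """<h1>Technical SEO Audits</h1>
<h2>Site Speed</h2><h2>Sitemaps</h2>
<h1>Another H1 that should not be here</h1>"""

audit = HeadingAudit()
audit.feed(PAGE)
print(audit.counts)                     # {'h1': 2, 'h2': 2}
print(audit.counts.get("h1", 0) == 1)   # False - this page needs fixing
```

A crawler such as Screaming Frog reports the same counts across the whole site in one pass.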

Schema

Schema markup is structured data that you apply to pages to convey context, messages and details to the search engines. In a technical SEO audit, you can check whether there are any errors in your Schema Markup by using Search Console, which can easily highlight what the errors are and where they are. You can also use Google's Structured Data Testing Tool to do this.
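As a small illustration, here's a minimal Article schema built in Python and serialised to the JSON-LD that would sit inside a script tag on the page; the field values are taken from this post, but a real implementation would include more properties:

```python
import json

# A minimal Article schema, built as a Python dict.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Maximising Website Performance with Technical SEO Audits",
    "author": {"@type": "Person", "name": "Nathan Watkins"},
    "datePublished": "2021-02-19",
}

json_ld = json.dumps(article_schema, indent=2)
print(json_ld)

# A quick sanity check of the kind an audit performs: required fields present?
parsed = json.loads(json_ld)
print(all(key in parsed for key in ("@context", "@type", "headline")))  # True
```

Validators like the Structured Data Testing Tool perform a much deeper version of this check against the full schema.org vocabulary.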

For more information, you can read our full blog dedicated to Schema and SEO here.

Image Alt Attributes

An image alt attribute's primary purpose is to describe an image to the visually impaired whilst also providing an opportunity for keyword optimisation. A crawl of your website using a tool like Screaming Frog will present you with a list of every image on your website, and from this, you will be able to highlight any missing alt attributes or ones that could be optimised further than they currently are.
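The missing-alt check can be sketched with the standard-library HTML parser; the image paths below are invented, and one image deliberately has an empty alt while another has none at all:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect images that are missing an alt attribute (or have an empty one)."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and not a.get("alt"):
            self.missing_alt.append(a.get("src"))

PAGE = """<img src="/images/sofa-blue.jpg" alt="Blue two-seater sofa" />
<img src="/images/sofa-red.jpg" alt="" />
<img src="/images/logo.png" />"""

audit = AltAudit()
audit.feed(PAGE)
print(audit.missing_alt)
# ['/images/sofa-red.jpg', '/images/logo.png']
```

The resulting list is a direct to-do list of images that need descriptive alt text writing.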



As you can see from this checklist, when conducting a technical SEO audit, there is a whole host of points that need to be assessed and optimised in order to get your website running as smoothly as possible. From the speed of your website and technical hiccups in the background, to the optimisation of your website for mobile, these are all factors that add up together and can have a considerable impact on your website's performance. For more information on the field of SEO, you can visit our SEO services page here, and you can always get in touch with us if you're interested in working on a new SEO project.


Fancy working with a results-focused digital agency?

Get in touch

01202 037 746