
Uncover the technical issues that are holding your rankings back and unleash the full potential of your Website with our detailed SEO audit checklist.

Without a shadow of a doubt, one of the best things a website owner can do is give their site a clean technical bill of health by examining it in detail from every angle, from duplicate content to schema markup to title tags. Often a site will have a number of unseen, underlying issues that could be, and probably are, stopping it from reaching its full potential. So below is a rundown of all of the checks that need to be carried out.

However, as I’m sure you’re already aware, it would be really hard to carry out a technical audit without some specialist tools, so at the start of each section are the SEO audit tools needed to get the most out of your audit.

Step 1: Google Properties

Tools needed:

  1. GA Checker
  2. Google Analytics Debugger

Is Google Analytics installed?

  • Check using: GA Checker

Why does this matter?

  • Without Google Analytics installed you're losing out on incredibly valuable data such as what your audience looks like, how they're finding you, and what they do whilst on your site. But most importantly it'll give you a strong indication as to whether or not you've been hit with an algorithmic demotion.

GA duplication check

  • Check using: Google Analytics Debugger

Why does this matter?

  • It shows you error messages and warnings that will tell you if your analytics tracking code is set up incorrectly.

Is Google Search Console set up?

  • Check using: Visual

Why does this matter?

  • GSC is a critical tool that provides information you simply don't get elsewhere, such as the number of pages indexed, indexation issues, an overview of the external backlinks pointing to your site, and a breakdown of individual page performance, including each page's average ranking position and impressions. It also shows you if you've been given a manual penalty due to link-building efforts that obviously belong to PBNs etc., which must be addressed through disavow files. Basically, this is where Google will tell you if you have errors. You should also upload your sitemap files to GSC and update them as and when your content changes. Whilst this isn't the most critical of things to keep on top of, it is a handy way to let Google read the important bits of your Website and know when you've added new pages.
  • You'll also get vital information on your Core Web Vitals inside GSC, which will help with aspects such as page speed.

Step 2: Website Architecture

Tools needed:

  1. Screaming Frog
  2. SEMrush
  3. Sitebulb
  4. DeepCrawl
  5. Google Search Console

Pagination checks

  • Check using: Screaming Frog

Why does this matter?

  • Pagination is usually an eCommerce or news Website issue, as these sites often have many products or articles within a category, and users need to click next/prev links to move forwards or backwards through the set. This is done to make the site's information digestible, and by setting up your pagination correctly you're indicating the relationship between component URLs in a paginated series to search engines, so Google understands how the pages fit together. Now whilst Google has publicly stated it no longer uses rel next/prev as an indexing signal, I still believe it's better safe than sorry and recommend pagination is handled correctly.
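
If you want a quick scripted spot-check alongside a Screaming Frog crawl, here's a minimal Python sketch (it assumes the third-party requests and beautifulsoup4 libraries are installed; the category URL is a placeholder for your own) that prints any rel next/prev link elements a page declares:

```python
# Spot-check for rel="next"/"prev" pagination links on a category page.
# "https://www.example.com/category/" is a placeholder - use your own URL.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/category/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for rel in ("next", "prev"):
    link = soup.find("link", rel=rel)   # some themes put rel on <a> elements instead
    if link:
        print(f"rel={rel} -> {link.get('href')}")
    else:
        print(f"No rel={rel} link element found")
```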

Canonical checks

  • Check using: Screaming Frog

Why does this matter?

  • A canonical tag is a way to let search engines know which URL is the master version should the same content appear on more than one URL within your Website. This matters because duplicate content within a Website can dilute the master version, and at scale it can seriously affect the overall perceived quality of the site. By setting up your canonicals correctly you help search engine crawlers better understand your site and vastly reduce the chance they'll show the wrong URL in the SERPs.
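
To spot-check canonicals on individual pages without a full crawl, a sketch along these lines works (again assuming requests and beautifulsoup4; the URL is a placeholder, and note that trailing-slash differences will show up as mismatches):

```python
# Minimal canonical-tag check: fetch a page and report its canonical URL.
# The URL below is a placeholder; swap in pages from your own site.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/blue-widgets/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

tag = soup.find("link", rel="canonical")
if tag is None:
    print("No canonical tag found")
elif tag.get("href") == url:
    print("Self-referencing canonical - OK")
else:
    print(f"Canonical points elsewhere: {tag.get('href')}")
```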

Internal linking checks

  • Check using: Screaming Frog & SEMrush

Why does this matter?

  • Internal links are powerful, and I mean really powerful when handled correctly, as they give Google signals that “this page is important”. They're especially important for eCommerce Websites, which often cover multiple topics, and when used in conjunction with solid information architecture the results can be incredibly positive. Internal links should be topical and create silos to obtain the best results. However, it's even more critical that your internal links a) aren't broken and b) point to the correct page using anchor text that promotes a good user experience, i.e. makes it clear to the user where they'll be taken when they click on the link.

Site visualisation checks

  • Check using: Screaming Frog & Sitebulb

Why does this matter?

  • By viewing your site's architecture visually on a map you can understand exactly where your pages are linking and, most importantly, see if you have any orphan pages. It will help you manage the way bots crawl your site and untangle complex Website architecture.

Internal redirects

  • Check using: Screaming Frog Redirect Checker

Why does this matter?

  • Internal redirects in their simplest form are redirections from one URL to another; chained together, they force users and search engines to wait until there are no more redirects to step through. Unfortunately, the more redirects in a chain, the more link equity you lose, and the more equity you lose the worse your page's authority will be. Therefore if a URL goes from A to B to C to D, you need to cut out B and C and redirect from A straight to D.
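
You can expose a chain for any single URL with a few lines of Python (requests assumed; the URL is a placeholder for one of your legacy internal URLs):

```python
# Expose a redirect chain (A -> B -> C -> D) so you can repoint A straight to D.
import requests

r = requests.get("http://www.example.com/old-page", timeout=10, allow_redirects=True)

for hop in r.history:  # each intermediate redirect response
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", r.status_code, r.url)

if len(r.history) > 1:
    print(f"Chain of {len(r.history)} redirects - point the first URL at the final one.")
```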

Is robots.txt present?

  • Check using: Google Search Console

Why does this matter?

  • If mishandled, robots.txt is the quickest and easiest way to ruin a site's rankings, as it instructs crawlers on how to crawl your Website. robots.txt is part of the robots exclusion protocol, a set of Web standards that regulates how the Web is crawled and what access crawlers have to your content.
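
Python's standard library includes a parser for the robots exclusion protocol, so you can sanity-check your rules without any extra tools (the domain and paths below are placeholders):

```python
# Sanity-check robots.txt rules with the standard library's parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live file

for path in ("/", "/category/widgets/", "/checkout/"):
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```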

Are pages correctly blocked by Meta Robots?

  • Check using: Screaming Frog

Why does this matter?

  • The noindex meta directive is a powerful tool in your SEO armoury and, when handled correctly, can be hugely beneficial to crawl management. The noindex tag tells Google not to include the page it has been added to in its search index. For example, if you have an old blog post that is no longer relevant, adding the noindex directive means the post will eventually drop out of the SERPs. However, for search engines to actually see the noindex directive the page(s) must not be blocked via robots.txt. If the page is blocked via robots.txt, search engines will not be able to see that your page(s) have the noindex tag applied, and the pages will continue to show up in the SERPs.
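
Here's a small sketch that reads a page's meta robots directive and flags the exact conflict described above (requests and beautifulsoup4 assumed; the URLs are placeholders):

```python
# Check a page's meta robots directive, and warn if the page is noindexed
# but also blocked in robots.txt (Google would never see the tag).
import requests
from bs4 import BeautifulSoup
from urllib.robotparser import RobotFileParser

url = "https://www.example.com/old-blog-post/"

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
content = (meta.get("content", "") if meta else "").lower()
print("meta robots:", content or "none")

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()
if "noindex" in content and not rp.can_fetch("Googlebot", url):
    print("WARNING: page is noindexed but blocked in robots.txt")
```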

Site structure & silo set-up

  • Check using: Screaming Frog/Sitebulb

Why does this matter?

  • A well-defined silo will organise your Website's content in a logical and easy-to-understand manner and will consist of categories, subcategories, and products (if you're an eComm site). This creates a clear hierarchy.

Are URLs named correctly & optimised?

  • Check using: Screaming Frog/DeepCrawl

Why does this matter?

  • By using your keywords/phrases in URL strings you're giving the search engines an extra (albeit small) hint that this is a page the searcher will be interested in, and it is, therefore, a tiny ranking factor. Plus there's nothing worse than a long and ugly URL string. Make them clear, with exact or partial matches of the keyword phrase the page is about.

Error pages

  • Check using: Screaming Frog

Why does this matter?

  • A 404 status should be returned when a URL no longer exists. If missing URLs instead return a 200 status with a generic page (a ‘soft 404'), search engines can end up indexing many near-identical error pages, creating duplication, which is the direct opposite of what you should be looking to achieve.
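
A quick way to test this is to request a URL you know doesn't exist and confirm the server actually answers with a 404 (requests assumed; the domain is a placeholder):

```python
# Soft-404 check: a made-up URL on your domain should return a 404 status,
# not a 200 with an "oops" page.
import requests

r = requests.get("https://www.example.com/this-page-should-not-exist-xyz123",
                 timeout=10, allow_redirects=True)

if r.status_code == 404:
    print("OK - missing URLs correctly return 404")
else:
    print(f"Potential soft 404: server answered {r.status_code} at {r.url}")
```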

Is an HTML sitemap used?

  • Check using: Screaming Frog

Why does this matter?

  • HTML sitemaps are designed for humans to read if they're lost on your Website. Usually they're only needed for large sites, and they create a better user experience as they link to every page (or at least every key page) on your site, making it easier to navigate.

Is navigation easy to use?

  • Check using: Visual

Why does this matter?

  • Easy-to-use site navigation is one of the most important aspects of Website user experience. UX is a big ranking factor, so make your navigation clear and understandable.

Is important/good content within 4 clicks from home?

  • Check using: Screaming Frog

Why does this matter?

  • The closer an internal page is to the home page the better. Again, as above, it creates a better UX, but it also signals to Google that ‘this page is important' if it's only 2, 3, or 4 clicks away from the home page. This isn't always possible or easy with large sites with many categories, sub-categories, and product pages, but if a page is key to your success then try and keep it within 3 clicks.
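
Screaming Frog reports crawl depth out of the box, but if you'd like to see how the measurement works, here's a small breadth-first crawler that records each internal page's click depth from home (requests and beautifulsoup4 assumed; the start URL is a placeholder, and it's capped at 200 pages to stay polite):

```python
# Rough click-depth check: breadth-first crawl from the home page, recording
# how many clicks each internal URL is from home.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

start = "https://www.example.com/"
host = urlparse(start).netloc
depths, queue = {start: 0}, deque([start])

while queue and len(depths) < 200:   # cap so the sketch stays polite
    url = queue.popleft()
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == host and link not in depths:
            depths[link] = depths[url] + 1
            queue.append(link)

for url, depth in sorted(depths.items(), key=lambda kv: kv[1]):
    if depth > 4:
        print(depth, url)   # pages deeper than 4 clicks from home
```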

Step 3: Security & Redirects

Tools needed:

  1. Screaming Frog
  2. DeepCrawl
  3. SSL Checker
  4. Security Headers
  5. Uptime Robot
  6. View DNS
  7. Sucuri SiteCheck
  8. MX Toolbox
  9. Wayback Machine
  10. SEMrush

Primary protocol – HTTP/HTTPS

  • Check using: Screaming Frog/DeepCrawl

Why does this matter?

  • Hypertext transfer protocol secure (HTTPS) is the secure version of HTTP and is the primary protocol used to send data between a Web browser and the Website. By using HTTPS you’re adding a layer of encryption that will increase the security of the data being transferred. This is critical when dealing with people’s banking details and personal information. Screaming Frog will immediately highlight any obvious security issues.

Do all pages redirect from HTTP to HTTPS?

  • Check using: Screaming Frog

Why does this matter?

  • By leaving pages as HTTP only you're potentially leaving a back door open for hackers to break into your site and commit all sorts of unpleasant crimes such as spying on personal information left by your customers, injecting content or links onto pages, or in worst-case scenarios gaining complete control of your site. Pages that aren't HTTPS will also display as ‘Not Secure' in Chrome and may put some users off using your site for important actions.
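
To confirm the redirect is in place (and is a permanent 301 rather than a temporary 302), you can test a sample of URLs like this (requests assumed; the URLs are placeholders):

```python
# Confirm plain-HTTP URLs 301 to their HTTPS equivalents.
import requests

for url in ("http://www.example.com/", "http://www.example.com/contact/"):
    r = requests.get(url, timeout=10, allow_redirects=True)
    first_hop = r.history[0].status_code if r.history else None
    if r.url.startswith("https://") and first_hop == 301:
        print(f"OK: {url} -> {r.url}")
    else:
        print(f"CHECK: {url} ended at {r.url} (first hop: {first_hop})")
```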

Does the site have a valid SSL certificate?

  • Check using: Google Chrome & SSL Checker

Why does this matter?

  • An SSL (secure sockets layer) certificate is a data file that digitally binds a cryptographic key to an organisation's details and, when installed on a Web server, activates the padlock we've all become familiar with checking for before we use a Website. It's important for giving visitors confidence in your site. If you do have an SSL certificate it must be valid: invalid SSL certificates will be flagged by Google and may cause them to lower your rankings or remove you from the SERPs altogether if left unattended.
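
Python's standard library can read the certificate directly, which is handy for checking the expiry date before it bites you (the hostname is a placeholder; an invalid certificate will raise an SSLError here, which is itself a useful signal):

```python
# Read the site's SSL certificate and report its expiry date.
import socket
import ssl
from datetime import datetime, timezone

host = "www.example.com"
ctx = ssl.create_default_context()  # raises SSLError on an invalid certificate

with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()

expires = datetime.strptime(cert["notAfter"],
                            "%b %d %H:%M:%S %Y %Z").replace(tzinfo=timezone.utc)
print(f"Certificate expires {expires:%Y-%m-%d} "
      f"({(expires - datetime.now(timezone.utc)).days} days from now)")
```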

Is an HSTS policy in place?

  • Check using: Security Headers

Why does this matter?

  • HSTS (HTTP strict transport security) is a Web security policy that helps protect websites from protocol downgrade attacks or cookie hijacking. By using HSTS you're allowing your servers to declare that Web browsers should only use HTTPS connections.
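
Checking for the policy is as simple as looking for the response header (requests assumed; the URL is a placeholder):

```python
# Check whether the server sends an HSTS header.
import requests

r = requests.get("https://www.example.com/", timeout=10)
hsts = r.headers.get("Strict-Transport-Security")

if hsts:
    print("HSTS policy in place:", hsts)  # e.g. "max-age=31536000; includeSubDomains"
else:
    print("No Strict-Transport-Security header - consider adding an HSTS policy")
```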

Site uptime

  • Check using: Uptime Robot

Why does this matter?

  • Keeping your site online at all times is hyper-important! If your Website is hosted on poorly configured or flat-out bad servers then you run the risk of it going down, i.e. being offline. If this happens visitors are served a 5xx server error, and if it happens regularly and for long periods of time it can/will cause Google to demote your site or kick it off the SERPs completely. I recommend you pay for the best hosting you can afford, and one that's built to host the platform your site is built on, e.g. WP Engine is designed to host WordPress Websites.

Check the number of other sites on the server

  • Check using: View DNS

Why does this matter?

  • Tying into the above, good hosting means you'll be on servers that host only a handful of other Websites alongside yours, such as a managed hosting plan. Basic-level hosting options that cost only a few pounds per month tend to have over a thousand other Websites on them alongside yours. This will lead to poor site load times, as the servers are always working at close to maximum capacity, and if some of those sites become popular they'll draw server performance away from your site.

Broken internal links

  • Check using: Screaming Frog/DeepCrawl

Why does this matter?

  • Internal linking is an important aspect of SEO as it allows users to navigate from one page to another related page. Broken links lead to poor UX and a loss of user confidence. Broken links are also a bad signal to search engine crawlers as it's a sign you aren't looking after your site technically.
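
A crawler such as Screaming Frog is the right tool for a whole site, but for a single-page spot check a sketch like this will do (requests and beautifulsoup4 assumed; the page URL is a placeholder, and some servers answer HEAD requests oddly, so switch to GET if results look wrong):

```python
# List broken internal links on a single page by requesting every on-site href.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/blog/some-post/"
host = urlparse(page).netloc
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

seen = set()
for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"]).split("#")[0]
    if urlparse(link).netloc != host or link in seen:
        continue  # skip external links and duplicates
    seen.add(link)
    try:
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == "unreachable" or status >= 400:
        print(status, link)
```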

Javascript usage

  • Check using: Screaming Frog

Why does this matter?

  • Search engines have become increasingly good at reading and rendering Javascript; however, Javascript can and does cause issues. It can become complicated and can often not be fully rendered, which means any SEO work that sits within the Javascript can potentially be missed, so use Screaming Frog to check all Javascript is being rendered correctly. I also recommend using JS sparingly.

Are dynamic pages being served correctly?

  • Check using: Screaming Frog/DeepCrawl

Why does this matter?

  • Dynamic serving is a setup where the server responds with different HTML (and CSS) on the same URL depending on which user agent requests the page (mobile, tablet, or desktop). Most site owners don't need to worry about dynamic serving or rendering, as it's intended for Websites built using lots of Javascript, which most aren't. Search engines like Google can have a tough time crawling and indexing these types of pages, so I advise you to stay away from relying heavily on Javascript.

Malware & security checks

  • Check using: Sucuri SiteCheck

Why does this matter?

  • It goes without saying that malware, viruses, and blacklisting are severe problems for any Website, as they mean your Website effectively has software on it that'll be spying on both you and your visitors, potentially stealing valuable data and landing you in a whole heap of trouble.

Is the site on any blacklists?

  • Check using: MX Toolbox

Why does this matter?

  • Having malware injected onto a Website will lead to it being blacklisted with all of the major search engines plus a host of Web safety applications such as Norton Safe Web, McAfee SiteAdvisor, etc. This in turn will lead to your site being removed from the SERPs, coupled with a dramatic loss in revenue.

Does the domain have a bad history?

  • Check using: Wayback Machine & SEMrush

Why does this matter?

  • As a matter of course, it's a good idea to look back at your Website's history if it's an aged domain. Over the years, techniques used to rank sites have been problematic, to say the least, and can leave a footprint that might need addressing, such as the use of PBNs that Google has caught up with. It's also a good idea to check a domain's history if you're thinking of buying one because, as I've just mentioned, if it's had an owner with a particularly liberal view on SEO then you could effectively be wasting your time and money.

Step 4: Website speed check

Tools needed:

  1. Webpage Test
  2. GTmetrix
  3. Google Structured Data Testing Tool
  4. SEO Site Checkup
  5. Screaming Frog
  6. BuiltWith

Site loading speed

  • Check using: Webpage Test & GTmetrix

Why does this matter?

  • Website speed is a direct ranking factor, as confirmed by Google's Zhiheng Wang and Doantam Phan, who wrote: “The ‘Speed Update,' as we're calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. It applies the same standard to all pages, regardless of the technology used to build the page. The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content.” Therefore I recommend you pay close attention to how fast your site is. Aim for a fully loaded time of less than 1.5 seconds and a TTFB (time to first byte) of around 200 milliseconds.
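
Webpage Test and GTmetrix give you the full picture, but for a rough TTFB reading from the command line you can lean on requests (assumed installed; its timer stops once response headers arrive, which approximates time to first byte; the URL is a placeholder):

```python
# Rough TTFB measurement using requests' elapsed timer.
import requests

r = requests.get("https://www.example.com/", timeout=10)
ttfb_ms = r.elapsed.total_seconds() * 1000  # time until headers were parsed

print(f"Approximate TTFB: {ttfb_ms:.0f} ms (aim for around 200 ms)")
```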

Is structured data & Schema being used?

  • Check using: the Google testing tool & GSC

Why does this matter?

  • Structured data helps search engines understand what the content on Web pages is about, and enables rich snippets in the SERPs before a user even clicks through to a Website. These rich snippets can be anything from review rating stars to events or articles and can massively improve the click-through rate to your site. I highly recommend applying Schema markup to your site, specifically JSON-LD as it's Google's go-to standard of markup.
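
As an illustration of the JSON-LD format, here's a sketch that builds a minimal schema.org Product snippet with review stars (the product values are made-up placeholders; in practice your CMS or a plugin would normally inject this for you):

```python
# Build a minimal schema.org Product snippet in Google's preferred JSON-LD format.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Widget",  # placeholder values throughout
    "aggregateRating": {"@type": "AggregateRating",
                        "ratingValue": "4.7", "reviewCount": "89"},
}

# Paste the printed block into the page's <head> (or inject it via your CMS).
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```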

Is CSS & Javascript being minified?

  • Check using: SEO Site Checkup

Why does this matter?

  • By minifying CSS & JS you're reducing file sizes and the bandwidth of network requests, thus improving page loading times. Minification works by removing extra spaces and crunching variable names whilst keeping exactly the same functionality as the unminified files. The reason files aren't minified out of the box is that the spacing and readable names make them much easier for developers to work on.
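
Your build tool or CDN should do the minifying for you, but to illustrate what's actually happening, here's a deliberately naive sketch that strips comments and whitespace from a CSS string (for illustration only; use a proper minifier on real files):

```python
# Naive CSS "minifier": drop comments and collapse whitespace to show the idea.
import re

css = """
/* Main button style */
.button {
    color: #ffffff;
    background-color: #0066cc;
}
"""

minified = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop comments
minified = re.sub(r"\s+", " ", minified).strip()       # collapse whitespace
for old, new in (("; ", ";"), (" {", "{"), ("{ ", "{"), (": ", ":")):
    minified = minified.replace(old, new)

print(minified)  # .button{color:#ffffff;background-color:#0066cc;}
print(f"{len(css)} bytes -> {len(minified)} bytes")
```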

Are there any canonical errors?

  • Check using: Screaming Frog

Why does this matter?

  • Canonical errors are a common issue that usually comes with a straightforward fix. A canonical issue appears when search engines can access the same content via different URLs and attempt to index that content accordingly. This can cause confusion and duplication on the Web crawler's part so I recommend you make it clear through correctly implemented canonicalisation which piece of content is the master content.

What platform is the Website built on?

  • Check using: BuiltWith

Why does this matter?

  • To be honest, I was in two minds as to whether or not to include this section, because if you're reading this guide it's because you want to know how to carry out a technical audit by yourself, so you'll already know what platform your site is built on and whether or not it has the ability to be fast. However, what BuiltWith does give you is the ability to check out which plugins etc. your competitors are running and whether they're using a CDN (content delivery network), which they should be. The only real concern I have is Wix. If your site is built on Wix then you 100% must look to migrate to a better platform such as WordPress. Wix is not SEO-friendly; I'm sorry, but you cannot polish a turd.
  • Note on Wix – it has actually become a decent website builder these days, after hiring some talented SEOs who are changing the way in which it operates.

Is a CDN in use?

  • Check using: BuiltWith

Why does this matter?

  • The main reasons you need a CDN are performance and security. CDNs deliver content from your Website to visitors via a network of servers around the world and they do it quickly. The reason it's so fast is that your site's content is delivered from the servers that are closest to the user, i.e. if someone from France accesses your Website then your content is served to them via servers in France. They also (and in my opinion, it's the main reason you should use a CDN) add another brilliant security layer to your site as they protect you from nasty things such as DDoS attacks that can take your site offline. CDNs are also quickly scalable during heavy traffic periods. I highly recommend using Cloudflare, which I personally use.

Step 5: Images

Tools needed:

  1. Screaming Frog

Are images being optimised?

  • Check using: Screaming Frog

Why does this matter?

  • Poorly optimised images will obliterate the load times of your pages and cause users to become hugely frustrated with your site, so head into Screaming Frog and find all images over 100kb and adjust their sizes to make them small, light, and fast. Alternatively, you can run an image compression plugin such as ShortPixel Image Optimizer to compress your images automatically. I personally use ShortPixel and really like it. Not only does it work seamlessly but it also works directly with Cloudflare to help speed up image download times.
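
If you want to script the 100kb check for a single page, something like this works (requests and beautifulsoup4 assumed; the page URL is a placeholder, and note that some servers don't send a Content-Length header):

```python
# Flag heavy images on a page: HEAD each <img> src and report anything over 100kb.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for img in soup.find_all("img", src=True):
    src = urljoin(page, img["src"])
    size = requests.head(src, timeout=10,
                         allow_redirects=True).headers.get("Content-Length")
    if size and int(size) > 100 * 1024:
        print(f"{int(size) / 1024:.0f} kB  {src}")
```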

Are ALT tags being used & used properly?

  • Check using: Screaming Frog

Why does this matter?

  • By adding ALT tags to your images you're adding weight to your pages in Google's eyes by giving clear hints as to what your page is about. ALT tags are also very important for Google images and Google Lens. You're much more likely to appear for image searches if you've described your images with keywords in your CMS backend.
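
Screaming Frog will list missing ALT text site-wide, but a single page can be checked with a few lines (requests and beautifulsoup4 assumed; the URL is a placeholder):

```python
# List images with missing or empty alt text on a page.
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("Missing alt:", img.get("src"))
```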

Are there any broken images?

  • Check using: Screaming Frog

Why does this matter?

  • Broken (or dead) images are a clear sign that you aren't paying close attention to the technical health of your Website, and they create a poor user experience. Running a Screaming Frog scan will show any broken images you have. This is even more important if you have a large site, especially an eCommerce site.

Step 6: Mobile

Tools needed:

  1. Responsinator

Responsive check

  • Check using: Responsinator

Why does this matter?

  • We live in the age of mobile, so if your site isn't fit for mobile (I know you're still out there frustrating your users and losing money) then now's the time to sort it, because you will be demoted because of it. The best option here is to use a responsive platform such as WordPress, which takes care of all of this on its own, meaning your site (including images and videos) will fit into and work perfectly on any device, be it mobile, tablet, or desktop.
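
One quick scripted signal, alongside eyeballing the site in Responsinator, is whether the page declares a viewport meta tag at all, since responsive pages need one (requests and beautifulsoup4 assumed; the URL is a placeholder, and presence of the tag doesn't prove the layout is actually responsive):

```python
# A responsive page should declare a viewport meta tag; check for its presence.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://www.example.com/", timeout=10).text,
                     "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})

if viewport:
    print("Viewport meta tag found:", viewport.get("content"))
else:
    print("No viewport meta tag - the page is unlikely to render well on mobile")
```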

Is the site using popups and/or interstitials?

  • Check using: Visual

Why does this matter?

  • Google already takes action against overly aggressive adverts, pop-ups, and interstitials, particularly on mobile, so I recommend you use them carefully. Put yourself in your customer's shoes and ask yourself whether you would be frustrated by the interruptions, instead of looking to monetise aggressively. I'm not saying you shouldn't use these tactics, just use them softly.

Mobile navigation

  • Check using: Visual

Why does this matter?

  • When designing and building Websites today, they absolutely must be built from a mobile-first perspective. This means the real estate on your site must be clear and concise for mobile users, and none more so than the navigation. The checks that should be carried out are: a) are buttons easy to click, b) are they clearly labelled, c) are your most important pages close to the top, and d) is there good contrast. Without thoughtfully planned mobile navigation your users will struggle to get around your site, meaning you aren't maximising its potential.

Step 7: Page & Element Analysis

Tools needed:

  1. SEO Site Checkup
  2. W3C
  3. Web Accessibility
  4. Screaming Frog

Are deprecated HTML tags being used?

  • Check using: SEO Site Checkup

Why does this matter?

  • Deprecated HTML tags are old HTML4 constructs that have been replaced by modern HTML5 equivalents. If you have deprecated tags they will be clearly flagged. Even though HTML4 will probably be supported for a while yet, I recommend upgrading to HTML5 as it's the modern standard (although HTML4 & 5 are very similar) and works better in modern browsers.
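
Here's a small sketch that scans a page for some of the common deprecated tags (requests and beautifulsoup4 assumed; the tag list is a sample, not exhaustive, and the URL is a placeholder):

```python
# Scan a page for common deprecated/obsolete HTML4 tags.
import requests
from bs4 import BeautifulSoup

DEPRECATED = ["center", "font", "big", "strike", "tt", "marquee",
              "acronym", "applet", "frame", "frameset"]

soup = BeautifulSoup(requests.get("https://www.example.com/", timeout=10).text,
                     "html.parser")
for name in DEPRECATED:
    hits = soup.find_all(name)
    if hits:
        print(f"<{name}> used {len(hits)} time(s)")
```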

HTML validation

  • Check using: W3C

Why does this matter?

  • By validating your HTML you're ensuring the markup is free of syntax errors. Syntax errors such as unclosed tags can make a Web page render wildly differently from how it's supposed to look, and in some cases can stop pages from rendering correctly at all.

Accessibility checks

  • Check using: Web Accessibility

Why does this matter?

  • Web accessibility is about making your Website usable by people with disabilities. That means your site must meet the needs of people with impaired sight, movement, hearing, or cognitive abilities. The World Wide Web must be freely accessible to everyone, so all barriers should be removed by:
  1. Using a content management system that supports accessibility
  2. Making sure your content is structured in a logical manner
  3. Using descriptive ALT tags on your images
  4. Having clear and descriptive names on all links
  5. Using clear colour combinations
  6. Using columns and table layout carefully so screen readers don't encounter issues

Are there any duplicate page titles?

  • Check using: Screaming Frog

Why does this matter?

  • Duplicate page titles are separate URLs sharing the same page title. Your page title is one of the most important aspects of on-page SEO, as it's one of the first things visitors see to help them understand what your page is about. By having duplicate page titles you're not only potentially confusing visitors but also giving search engines a headache, because they'll have to decide which page is more relevant, especially if the content on the pages is similar. Granted, Google does sometimes rewrite page titles, but as best practice all page titles should be unique.
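
For a scripted version of this check (and the missing-titles check below), feed in a list of URLs, for example exported from your XML sitemap (requests and beautifulsoup4 assumed; the list is a placeholder):

```python
# Find duplicate and missing <title> tags across a list of URLs.
from collections import defaultdict
import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/", "https://www.example.com/about/"]

by_title = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    if not title:
        print("Missing title:", url)
    by_title[title].append(url)

for title, pages in by_title.items():
    if title and len(pages) > 1:
        print(f"Duplicate title '{title}' on {len(pages)} pages:", *pages)
```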

Do page titles describe the page content?

  • Check using: Visual

Why does this matter?

  • As mentioned above, your page titles are the first thing visitors will see when they find you in the SERPs, so make sure your page titles let them know what your page is about by using the keywords they're expecting to see after they've performed the search. It's also best to place the most important keywords towards the start (left) of the page title.

Are any page titles missing?

  • Check using: Screaming Frog

Why does this matter?

  • Missing title tags are an issue because you're forcing the search engines to write one for you, if indeed they decide to rank you at all. Now, although the likes of Google already rewrite page titles from time to time to fit the content to the searcher, I highly recommend all pages have descriptive titles written by either the Website owner or an SEO professional, based on keyword research.

Are Meta descriptions unique?

  • Check using: Screaming Frog

Why does this matter?

  • As above with page titles, Google will often rewrite meta descriptions based on your content, the search performed, and the user intent. Meta descriptions aren't a direct ranking factor but they are your first sales pitch to potential visitors. Google will also bold keywords that match the searchers' intent so make sure you write them in a way that describes your content and has an offer or introduction, depending on what you're trying to achieve.

Step 8: User Experience

Can I get anywhere on the site in a short number of clicks?

  • Check using: Visual

Why does this matter?

  • Every key money page/category should be at the forefront of your navigation and should ideally be accessible immediately from the home page, with all other pages only 2 to 4 clicks away. Your top-level navigation has a massively positive impact on user experience and indexation in the search engines if set out correctly; without the ability to get from one page of your site to any other quickly and easily, you will lose both customers and money. Poor usability metrics are often a clear indication that your site navigation is at fault.
  • An internal search function is also recommended on large Websites, especially eCommerce sites, as people use them, a lot!

Is the menu easy to understand & does it make sense?

  • Check using: Visual

Why does this matter?

  • Your menu links must describe what's on the pages they point to, so they must be clear and logical; users and Google both want to see the same thing.

Summary

It's pretty easy to build a fancy Website in this day and age, but it's also very easy to make mistakes that can stop it from ranking in the SERPs. Therefore I recommend you take the time to look closely at your site with my technical SEO checklist and an analytical mindset, to make sure you aren't missing something that's hurting your site's performance.

Go through your site with as many of the tools in this article as possible (especially Screaming Frog) and systematically work through the issues. I promise you won't regret it.