Technical SEO Audit: 5 Key Issues to Cover


When it comes to technical audits, a lot of people just rely on the outputs of tools to inform where the focus should be. While tools are fantastic and some of the crawling configurations will give you everything you need as a starting point, they are just that – a starting point.

Overlooked elements of technical SEO.

Technical SEO isn’t just “can my site be crawled and indexed?”, it’s “can my site be crawled and indexed effectively?”. You need to look beyond indexation levels and into what is actually driving them, then identify anything that could be limiting your performance against your objectives. That means things like:

  1. Site structure
  2. Internal linking
  3. Canonical tag setup
  4. Faceted navigations
  5. Asset optimisation

1. Site structure.

A lot of people shy away from site structure, opting to accept what they are working with and build on it rather than weighing up the benefits that changing the structure might bring.

A lot of newer websites don’t tend to run into issues with site structure – they follow logical folder structures, URLs make sense, and navigations reflect the core pages you want to display. But some still fall victim to poor planning or ad hoc additions.

Things to look out for here include:

  • Page depth levels – use Screaming Frog or your preferred crawling tool to see where pages on the website actually sit. Are there unnecessary levels between pages? Are there pages that should be closer to, or further from, the homepage?
  • Orphan pages – where do these come up on the site? Why are they appearing, and where should they sit?
  • Folder structure – do all of your sub-pages sit in the correct folders? Is there any duplication of subcategories across different category levels?
  • Navigation – are all of your key pages in your main navigation? Are any duplicated in the footer? How many people are using the navigation links to get around your website?
  • Site search – if you have search functionality on the website, how are people using it, and is there a reason they’re using it instead of following your folder structure?

Cleaning up site structures will not only help crawlers navigate your website more easily, but also help them understand where the important pages actually are on your site and why they should be seen as such. In turn, you’ll also be improving the experience for your users – a win-win.
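The first two checks above can be sketched in a few lines. This is a minimal, hypothetical example – the link graph and URLs are made up, and in practice they would come from a crawl export – but it shows the idea: page depth is a breadth-first search from the homepage, and orphan pages are the URLs you know about that the search never reaches.

```python
from collections import deque

def page_depths(link_graph, homepage):
    """Breadth-first search from the homepage: a page's depth is the
    minimum number of clicks needed to reach it via internal links."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical crawl data: page -> pages it links to.
links = {
    "/": ["/shoes/", "/about/"],
    "/shoes/": ["/shoes/trainers/"],
    "/shoes/trainers/": ["/shoes/trainers/red-trainer/"],
}
depths = page_depths(links, "/")
print(depths["/shoes/trainers/red-trainer/"])  # 3 clicks from the homepage

# Orphan pages: known to exist (e.g. in the sitemap) but never linked to.
all_known_pages = set(links) | {"/old-promo/"}
orphans = all_known_pages - set(depths)
print(orphans)  # {'/old-promo/'}
```

A crawler like Screaming Frog reports these numbers for you; a sketch like this is mainly useful when you want to re-run the analysis on your own crawl data, for example to test how a proposed structure change would alter depths before building it.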

2. Internal linking.

Internal linking is possibly the most overlooked part of technical SEO. It acts as a map of your website, allowing crawlers and users to locate and access related pages from the page they are currently on.

This gets broken down into two streams:

  1. Hierarchical linking
  2. Contextual linking

Hierarchical linking covers things like the navigation, footers, and sitemaps – setting out a clear structure of importance across the website. This acts as a structural map of your website.

Contextual linking happens in the actual content on the pages, linking through to related content and documents, services and products, and pages and sub-pages. This acts as a relevance map of your website, tying concepts together to help improve entity understanding and flow of relevance through the website.

Things to look for here include:

  1. Broken internal links – are any links on the website leading to invalid pages?
  2. Anchor text – what words or phrases are being used to link through to other pages, are they the most appropriate for passing relevancy through from one page to the other?
  3. Surrounding text – where is the link being placed in the content, does that provide additional context as to why the link is there, does it provide additional relevance for where the link is pointing to?
  4. Link chains – do any links pass through multiple stages, whether via HTTP redirects, canonical redirects, or both?
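The link-chain check can be expressed as a small function over crawl data. This is a hedged sketch – the URLs are hypothetical and the hop map would come from your crawl export’s redirect and canonical columns – but it shows how a single internal link can quietly force crawlers through multiple hops:

```python
def resolve_chain(url, hops, max_hops=10):
    """Follow redirect/canonical hops from a crawl export and return the
    full chain. More than one hop means the link should be updated to
    point straight at the final destination."""
    chain = [url]
    while url in hops and len(chain) <= max_hops:
        url = hops[url]
        if url in chain:  # loop protection: A -> B -> A
            chain.append(url)
            break
        chain.append(url)
    return chain

# Hypothetical crawl data: URL -> where it redirects or canonicalises to.
hops = {
    "http://example.com/shoes": "https://example.com/shoes",    # HTTP -> HTTPS
    "https://example.com/shoes": "https://example.com/shoes/",  # trailing slash
}
chain = resolve_chain("http://example.com/shoes", hops)
print(chain)
print(len(chain) - 1)  # 2 hops for any link still pointing at the old URL
```

Running this over every internal link target gives you a prioritised clean-up list: any chain longer than one hop means the linking page should be updated to reference the final URL directly.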

3. Canonical tag setup.

Canonical tags can be a minefield, and if set up incorrectly can prevent key pages from seeing the performance they should. This is particularly common with migration projects, whereby existing canonical tags are copied across even if the site or URLs have changed significantly. But they also appear in everyday reviews.

Things to look for here include:

  1. Chains – are there any instances on the website where a URL features a canonical tag to another URL, which then also features a conflicting canonical tag or redirects elsewhere? This is the single biggest thing to assess and can have monumental impacts on your performance if cleaned up properly.
  2. Alternate status – where you are pointing crawlers to an alternative URL, check the status code and indexability of the destination to make sure it can be accessed correctly.
  3. Source status – for any canonical tag pointing to an alternate URL, check the status of the page it originates on. This can help ensure non-indexable rules, like robots.txt or noindex/nofollow, are not passed on to the destination URL.
  4. Migrations – if you’re doing a migration, or know one has been completed previously, check the canonical tags to make sure they are still pointing to the correct URLs, including http to https, URL structure changes, and pruned content.
  5. Filters and faceted navigations – most common on ecommerce sites but also present on some blog setups, check through each facet to see where the canonical tags are pointing to and whether that is an appropriate page – should a filtered product listing page point to the subcategory or category or department?
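Checks 1–3 above can be automated over a crawl export. The sketch below is illustrative only – the URLs and the crawl structure are made up – but it captures the logic: for every non-self-referencing canonical tag, confirm the destination returns a 200, is indexable, and doesn’t canonicalise somewhere else in turn.

```python
# Hypothetical crawl export: URL -> (status_code, canonical_target, indexable)
crawl = {
    "https://example.com/red-shoes?size=9": (200, "https://example.com/red-shoes", True),
    "https://example.com/red-shoes": (200, "https://example.com/red-shoes", True),
    "https://example.com/old-range": (200, "https://example.com/discontinued", True),
    "https://example.com/discontinued": (404, None, False),
}

def canonical_issues(crawl):
    """Flag canonical tags whose destination is broken, chained, or non-indexable."""
    issues = []
    for url, (status, target, indexable) in crawl.items():
        if target is None or target == url:
            continue  # no tag, or self-referencing canonical: nothing to check
        dest = crawl.get(target)
        if dest is None:
            issues.append((url, "canonical target not crawled"))
        elif dest[0] != 200:
            issues.append((url, f"canonical target returns {dest[0]}"))
        elif dest[1] not in (None, target):
            issues.append((url, "canonical chain: target canonicalises elsewhere"))
        elif not dest[2]:
            issues.append((url, "canonical target is not indexable"))
    return issues

print(canonical_issues(crawl))
```

Here the filtered product URL correctly canonicalises to an indexable 200 page and passes, while the old range page is flagged because its canonical points at a 404.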

4. Faceted navigations.

There is nothing wrong with having faceted navigations on your site, they help users drill into different sections of a website to find what they need. But, they can be a nightmare for crawlers.

One wrong turn and you’re suddenly dealing with a spider trap, with infinite URL combinations being created. To make things more complex, you’ll likely want some of the facets to be indexable but not all of them, so you can’t just apply a blanket rule.

Things to look for here include:

  1. Indexation & crawlability rules – what rules are in place? Are pages being blocked by robots.txt, noindex, URL parameters in Google Search Console, alternate canonical tags? Are they correct?
  2. Specific use cases – are the facets you want indexing set up properly to allow that? Are there any facets specifically you don’t want indexing, can these be removed across the site or are they used in some cases but not others?
  3. URL configurations – do URLs follow the same parameter order regardless of the order the filters are applied in? For example, if you select colour then size, does it return the same URL as selecting size then colour? This affects crawlability rules as well as any canonical tags in place.
  4. Follow or nofollow – are facets set up as nofollow links? Sometimes this is the only workaround for stopping a spider trap, and one we’ve seen great success with previously. But you’ll need to be sure of the setup you’re working with before you make any decisions to change it – allowing certain facets to be available could lead to all of them being available if not monitored correctly.
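The URL configuration point is worth illustrating. One common fix – assuming your platform lets you control facet URL generation – is to normalise the parameter order so that the same filter combination always produces one URL, whichever order the user clicked in. A minimal sketch with a hypothetical shoes URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def normalise_facet_url(url):
    """Sort query parameters alphabetically so the same facet combination
    always yields one URL, regardless of filter selection order."""
    parts = urlsplit(url)
    params = sorted(parse_qsl(parts.query))
    return urlunsplit(parts._replace(query=urlencode(params)))

# Colour-then-size and size-then-colour collapse to the same URL.
a = normalise_facet_url("https://example.com/shoes?colour=red&size=9")
b = normalise_facet_url("https://example.com/shoes?size=9&colour=red")
print(a == b)  # True
print(a)
```

Without this, each ordering is a separate crawlable URL that needs its own canonical tag pointing at the preferred version, multiplying the crawl space for no benefit.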

5. Asset optimisation.

SERPs are getting more visual, with more images and videos appearing for informational queries as well as commercial terms, especially for products. With Google working on rolling out augmented reality shopping capabilities, the SERPs are about to get even more visual. Your assets – images and videos – need to be optimised so they can be picked up and displayed.

Things to look for here include:

  1. Alt tags – are appropriate alt tags in place to explain the asset?
  2. Structured data/schema – have you got appropriate markup on the assets? For product images, are they tagged up appropriately? For videos, have you specified content at different parts?
  3. File names – are they appropriate? Do they add relevance to the page they are on rather than simply being “photo-1726429.jpg”?
  4. File size – are the assets appropriately compressed, or do they considerably delay page load times? This is going to grow in importance as more emphasis is placed on page experience signals, so the faster your assets load, the better – and that starts with their size.
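The alt tag and file name checks lend themselves to a simple scan. The sketch below uses Python’s standard-library HTML parser over a hypothetical snippet – the file-name pattern is just one heuristic for camera-style names, not a definitive rule:

```python
import re
from html.parser import HTMLParser

class ImageAudit(HTMLParser):
    """Flag <img> tags with missing/empty alt text or camera-style file names."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "")
        if not attrs.get("alt"):
            self.issues.append((src, "missing or empty alt text"))
        # Heuristic: "photo-1726429.jpg"-style names add no relevance.
        if re.search(r"(?:img|photo|dsc)[-_]?\d+\.(?:jpe?g|png|webp)$", src, re.I):
            self.issues.append((src, "non-descriptive file name"))

html = """
<img src="/media/red-leather-trainer.jpg" alt="Red leather trainer, side view">
<img src="/media/photo-1726429.jpg" alt="">
"""
audit = ImageAudit()
audit.feed(html)
print(audit.issues)
```

The descriptive, alt-tagged image passes; the camera-named image with an empty alt attribute is flagged twice. Run over a full crawl, this gives you a worklist rather than a manual page-by-page review.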

In summary, looking only at what a tool flags as an issue will leave you focusing on what the tool says is a priority, rather than what could actually move the needle for your objectives. There’s always an element of “housekeeping” with technical SEO – making sure your pages can be indexed, title tags are relevant, H1s are properly populated – but those parts rarely form the most impactful changes.

The aspects above can have a considerable impact on your website’s performance, and making small changes to them – or completely overhauling them, depending on what you’re working with – can show significant improvements against your objectives.

Whether you are looking to improve rankings, drive more traffic, or drive more conversions and revenue, all of these will go some way to helping you achieve your goals.

If you need help getting to the bottom of your website’s performance and understanding where the opportunities are and how you can capitalise on them, get in touch and our SEO team will be happy to help.