Technical SEO elements that will help your site rank


SEO Ranking Factors Series #2: Technical Optimisation.
In a series of posts, we look at some of the key factors that Google and the other search engines take into consideration when deciding where to place a site in the organic results for a given search query. This time, it’s technical factors and the difference they can make.

The importance of technical elements as ranking factors

A firm grasp of the technical elements of SEO, and the principles behind them, forms the basis of any successful organic campaign. If these elements are not implemented effectively, other key factors in SEO such as content and outreach cannot perform to their full potential: Googlebot will be unable to give full credit for the relevance and authority of your web pages for given keyword searches if your domain is not technically sound.

Given the complexity of search engine algorithms, and the fact that they are continually subject to change, it’s essential for webmasters to maintain a clear understanding of the features known to affect positioning on SERPs, as well as those which, while not thought to impact rankings directly, can still help with SEO.

Multiple studies of high-ranking sites indicate that the best performers tend to have certain technical aspects in common. Below we discuss the crucial elements to bear in mind, and current best practice when implementing them…

Technical optimisations

Architecture and crawlability

If we think of the internet as a gigantic book, then for a website to feature in the web index, and thereby appear in the rankings, ‘spiders’ such as Googlebot and Bingbot need to be able to crawl it and determine its relevance for particular queries. If they can’t find their way around easily, the site won’t rank – it’s as simple as that – so ensuring that your website has a crawlable architecture is essential for boosting search rankings.

Submitting a sitemap, using canonicalisation correctly, implementing 301 redirects and building an effective internal link structure can all help boost crawlability. Note that if a site relies on Flash or JavaScript, these can impede spiders from making their way around by making any links or HTML embedded within them unparseable. While Googlebot’s ability to crawl JavaScript content has improved in 2016, it’s still better to be safe than sorry.
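To make this a little more concrete, here is a rough sketch of how you might spot-check two of these elements yourself. It assumes Python with the requests and beautifulsoup4 packages installed, and the URL is simply a placeholder for one of your own pages; it checks whether an old address redirects with a permanent 301 and whether the destination page declares a canonical URL.

```python
# Rough sketch: verify a redirect is a permanent 301 and check for a canonical tag.
# Assumes the third-party `requests` and `beautifulsoup4` packages are installed;
# the URL below is a placeholder for one of your own pages.
import requests
from bs4 import BeautifulSoup

def check_redirect_and_canonical(url: str) -> None:
    response = requests.get(url, allow_redirects=True, timeout=10)

    # Each hop in the redirect chain: a 301 passes clearer signals to crawlers than a 302.
    for hop in response.history:
        print(f"{hop.url} -> {hop.status_code}")

    # Look for <link rel="canonical" href="..."> on the final page.
    soup = BeautifulSoup(response.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if canonical and canonical.get("href"):
        print(f"Canonical URL: {canonical['href']}")
    else:
        print("No canonical tag found")

check_redirect_and_canonical("https://www.example.com/old-page")
```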

The internal linking structure

As discussed above, a website’s visitors aren’t always human. An effective internal link structure not only makes it easy for someone to navigate a website and find the information they need; it also helps search engine spiders to understand what you consider to be important and where it is located.

By making it easy for the spiders to index your pages and assess the content of each, you help your rankings, spreading domain authority and ranking power across your website. Aim to link deep into the website, in natural directions, to valuable content that you want people to discover, rather than to your Home or Contact pages. Use followed links with descriptive anchor text, embed them naturally in the body text, and use your internal links sparingly – while it’s not necessarily detrimental, adding dozens of links won’t have much more effect on rankings than a handful.

Make the internal links as natural as possible. Exact-match anchor text is preferable to a generic ‘Click here’ and is a strong ranking signal, but it’s also something that Google has been looking at closely recently, so don’t overdo it.
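For illustration only, the sketch below (again assuming Python with requests and beautifulsoup4, and a placeholder domain) lists the internal links on a page along with their anchor text – a quick way to see what the spiders will follow and how each target page is described.

```python
# Rough sketch: list internal links and their anchor text for a single page.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def list_internal_links(page_url: str) -> None:
    domain = urlparse(page_url).netloc
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for link in soup.find_all("a", href=True):
        target = urljoin(page_url, link["href"])
        # Keep only links that stay on the same domain.
        if urlparse(target).netloc == domain:
            anchor = link.get_text(strip=True) or "(no anchor text)"
            print(f"{anchor} -> {target}")

list_internal_links("https://www.example.com/")
```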


The site is HTTPS

Hypertext Transfer Protocol Secure, or HTTPS, effectively means that any communication between a person’s web browser and a website bearing that prefix in its URL is encrypted and therefore secure.

Google has been at the forefront of pushing for an HTTPS web and, while it has itself said that HTTPS is only a low-strength ranking signal at present, it’s quite possible that it will grow in importance in the future.

As such, HTTPS currently has only a low correlation with organic performance, and since implementation can be both costly and complex, take-up is a long way from complete. But it does result in greater security and user confidence, and webmasters can restrict it to certain pages, such as registration and checkout, to make installation easier and reduce the strain on resources. There are also techniques available that can prevent HTTPS from slowing page loading speeds, which is vital, especially for major ecommerce websites. For further detail on HTTPS, see our blog post Making the Leap: Should You Kermit to HTTPS?
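As a rough illustration (assuming Python with the requests package installed, and example.com as a placeholder domain), the sketch below checks whether the plain-HTTP version of a site redirects to HTTPS and whether an HSTS header is being served:

```python
# Rough sketch: confirm the plain-HTTP address redirects to HTTPS and check for HSTS.
# Assumes the `requests` package is installed; "example.com" is a placeholder domain.
import requests

def check_https_setup(domain: str) -> None:
    response = requests.get(f"http://{domain}", allow_redirects=True, timeout=10)

    print(f"Final URL: {response.url}")
    if response.url.startswith("https://"):
        print("Redirects to HTTPS")
    else:
        print("Still served over plain HTTP")

    # HSTS tells browsers to use HTTPS for all future visits.
    hsts = response.headers.get("Strict-Transport-Security")
    print(f"HSTS header: {hsts}" if hsts else "No HSTS header found")

check_https_setup("example.com")
```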

Existence of a meta description

Writing meta descriptions won’t directly affect your search rankings, but they are very useful for improving click-throughs from the SERPs.

These snippets describing the content of a page are what appear on the SERPs, rather like organic advertisements, so they need to be correctly optimised. Aim for 150-160 characters: too long and your sentence will be cut off; too short and it doesn’t look right, denting user confidence. Write a persuasive description, deploy keywords smartly (Google helpfully bolds matching search terms to make them stand out), and of course make sure it is relevant to the page content.

Ideally, every meta description should be unique from all the others you use, so if you have a very large number of pages to describe, you might consider leaving some blank and letting the search engines pull through content from each page to create their own.
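As a quick illustration (assuming Python with requests and beautifulsoup4 installed, and a handful of placeholder URLs), the sketch below pulls each page’s meta description and flags any that are missing, overly long or duplicated:

```python
# Rough sketch: flag missing, overly long or duplicated meta descriptions.
# Assumes `requests` and `beautifulsoup4` are installed; the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

seen = {}
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = (tag.get("content") or "").strip() if tag else ""

    if not description:
        print(f"{url}: missing meta description")
    elif len(description) > 160:
        print(f"{url}: {len(description)} characters, likely to be cut off")
    elif description in seen:
        print(f"{url}: duplicates the description on {seen[description]}")

    if description:
        seen.setdefault(description, url)
```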


The site speed

There are many things that are important to Google: web security, not being evil and, of course, the user experience, or UX for short. An interesting part of that is page-loading speed, which can be bogged down by a wide variety of factors, such as, potentially, HTTPS or video content that plays automatically. Slower loading speeds, even by fractions of a second, can cause users to click away in frustration, and so they are picked up on by search engines.

Time To First Byte, the duration until the first response is received from the server, has the potential to impact rankings if competing websites are otherwise equal. A fast internet connection helps, but it’s far from the end of the matter. You can consider dedicated hosting for your site, and look at optimising your back-end infrastructure and software so that requests get through your network to your servers, and data comes back, as fast as possible. Content Delivery Networks (CDNs) are a common way of reducing latency, and they are especially useful for companies that need a web presence in multiple countries and want to maximise site speed.
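For a rough idea of where you stand, the standard-library Python sketch below (with www.example.com as a placeholder host) times the gap between sending a request and receiving the response headers – a simple stand-in for Time To First Byte:

```python
# Rough sketch: approximate Time To First Byte using only the standard library.
# "www.example.com" is a placeholder host; the figure includes DNS, TCP and TLS setup.
import http.client
import time

def measure_ttfb(host: str, path: str = "/") -> float:
    connection = http.client.HTTPSConnection(host, timeout=10)
    start = time.perf_counter()
    connection.request("GET", path)
    response = connection.getresponse()  # returns once the status line and headers arrive
    ttfb = time.perf_counter() - start
    connection.close()
    print(f"{response.status} https://{host}{path}: {ttfb * 1000:.0f} ms to first byte")
    return ttfb

measure_ttfb("www.example.com")
```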

Note that the actual page loading speed, according to one Moz study, does not seem to correlate directly with search rankings, but given Google’s interest in UX, it would be unwise to assume that won’t change at some point.

The keywords are in the domain

Keyword-rich domains, such as buildawebsite.com, can certainly help with rankings, but they will understandably also face greater scrutiny from search engines looking to confirm the quality and relevance of their content.

Best practice calls for balancing the benefits of having a keyword in the domain name with ensuring that it remains catchy and easy for the user to type. Exact Match Domains, if you can afford to buy the one you want, can still have considerable value, but never rely on them as a way to increase traffic at the expense of other tactics.

If getting a keyword-rich domain is going to involve using multiple hyphens or using a lesser-known Top Level Domain, then it’s probably not worth it.

The use of Flash

Again looking at the user experience, Flash is an effective device for grabbing the visitor’s attention quickly and conveying information in a snappy format, yet it can work against a site’s performance. Flash sites often have unconventional navigation that can impede users as well as being slower to load.

If you want to incorporate Flash into your site but also help your rankings at the same time, don’t use it for site navigation. Create XML sitemaps for easy crawling and use HTML for the most important sections of your website.
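As a simple illustration (the page URLs are placeholders for your own site’s most important pages), the standard-library Python sketch below writes a basic XML sitemap that you could then submit to the search engines:

```python
# Rough sketch: write a basic XML sitemap with the standard library.
# The page URLs are placeholders for your own site's most important pages.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/contact/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url_element = ET.SubElement(urlset, "url")
    ET.SubElement(url_element, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```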

The domain is a .com

While your site’s Top Level Domain probably won’t affect your rankings, as Google itself has acknowledged, it’s thought that newer formats such as .travel or .gold can invite search engines and users to consider a site a little spammy.

They can offer some advantages, such as in differentiating your site from the competition or allowing you to buy a memorable web address without spending a fortune on it, but you should always take into account the navigation preferences of web users – they are far more likely to click on a .com or .co.uk.

Trickery such as having a domain name completely unrelated to a site’s content will get nowhere – there are now very few shortcuts with search and those that remain are being steadily picked off.


By now, if you’re a webmaster or work in SEO, you’ll be long familiar with the saying that Content is King. While it’s difficult to overstate the importance of high-quality content, overlooking the technical elements of a website, or worse, getting them wrong, can render all of your efforts for naught.

In order to rank, the website needs to be readied for crawling, and the weight that Google and other search engines apply to different technical aspects is forever shifting, so the work is never done. But then, isn’t the quest for constant learning and improvement what makes SEO so much fun anyway?


If you’re interested in how Found could supercharge your digital performance across SEO, PPC, Social and Digital PR, then get in touch today to speak to a member of the team.