On-page technical SEO

16 Jun 2023

Is SEO relevant in 2023?

Since ChatGPT burst onto the scene at the end of 2022, and with other AI tools such as Google’s Bard starting to launch, there has been a lot of talk that the way people search online is about to change forever, and that search engines as we know them, along with SEO, are dead.

It certainly is true that AI search tools are going to transform how people search online, but it is important to understand that in 2023 SEO is still very much alive and will remain a vital and relevant part of your website marketing strategy for the foreseeable future.

With this in mind, I thought it would be useful to review what SEO is, what the different aspects of SEO are and why it is so important to the success of your website.

What is it and why you should care

Have you ever wondered how search engines such as Google or Bing decide on the order in which they display organic results for a search you submit?

The secret is Search Engine Optimisation (SEO). SEO is a complex and broad topic but it is essential to the success of a website. A website that does not follow SEO best practice will underperform in search engine result pages (SERPs).

In this article, the first in a series I will be publishing on SEO, I will be covering technical on-page SEO. You will learn what it is and how you can implement some elements of it to help your website achieve your ultimate goal: first position for your targeted keywords.

It is all about providing a good user experience

The job of a search engine is to provide its users with high-quality and accurate results for the search terms being submitted. The big search engines are so successful because they understand the importance of a good user experience. If they were to display results containing irrelevant and poor-quality websites, users would quickly switch to one of their competitors.

At the core of all search engines there is an algorithm. You can think of this algorithm as the search engine’s brain. It is an eye-wateringly complicated calculation. It takes in a whole series of variables, crunches data and outputs the results to the end user (you) in what is known as a SERP or Search Engine Result Page – this is the list of websites you see when your search query returns results.

No one outside of the search engine companies (and perhaps not even many inside) truly knows exactly what importance is assigned to each of the variables when the algorithm assesses a search term. Nobody can say for definite what the algorithm is looking for or how much weight it places on certain factors. Even if you did know, you wouldn’t know for long, because updates are released constantly. What you can do, however, is learn and follow best practices: guidelines issued by the search engine companies which, if followed, will help your website index and perform well for the keywords you want to target. By following these best practices your site will become optimised for the search engines. The practice of doing this is known as Search Engine Optimisation.

Googlebot web crawler

Before we get into more detail on exactly what SEO is, you may be wondering how search engines know what is on a website. A simplified answer is that search engines have web crawlers that regularly visit your website and scan it to develop an understanding of what the website is about and who it is for. These web crawlers tend to be referred to simply as ‘bots’. Google, for example, has two types: Googlebot Desktop and Googlebot Smartphone. Each crawls over a website, simulating either a desktop or a mobile device user.

How well a website has been designed, built and written, and how closely SEO principles have been followed, dictates how well the web crawlers can understand the purpose of the site they are visiting.

A well-constructed website built by someone who understands SEO will be discoverable and understood by the search engines. The reward is that the site will appear higher up in a search engine result page for a relevant search term than a site that has been poorly constructed and contains poor content.

Keywords are king

Before you can build a successful website or develop successful copy for a site you need to know what keywords to target. Without targeted keywords or with the wrong targeted keywords your site is never going to perform the way you are hoping in the organic search results.

Don’t think of keywords as individual words but more as phrases. They are the search terms that your customers are most likely to type into a search engine when wanting to visit your site.

Deciding which keywords you should be targeting is a complex matter, but it is vital to the success of a website that you develop a correct list of keywords. How you develop this list warrants a separate article which I will be publishing in the next few weeks, so be sure to follow the Willow Leaf LinkedIn and Facebook pages to ensure you don’t miss being notified when the article goes live.

Once you have built a suite of keywords you have taken the first step towards a successful website. You now need to apply SEO techniques so that search engines can associate your site with those keywords.

What exactly is SEO?

The first thing to realise is that there are different types of SEO. The first distinction is between on-page and off-page SEO.

On-page SEO is all about the SEO contained within your website, whereas off-page is about the SEO for your site on other channels and platforms.

On-page SEO is broken down further into technical SEO and content SEO. It is technical on-page SEO that we are discussing in this article.

Technical SEO is concerned with the structure of a website. There are best practices that should be followed when designing and building a website to help search engines navigate around your site and to provide your site visitors with the best user experience possible.

Key elements of technical SEO

Robots.txt

This is a file found at the root level of a website and is one of the first things a visiting bot will look for when crawling a website. Within robots.txt you can specify files and folders that you don’t wish a bot to crawl. By disallowing these files and folders, the bot knows it can skip them and focus on crawling the pages you do want to be indexed.

You may be thinking why would you have pages on a site that you don’t want to be discoverable by a search engine. Well, if you have a content management system (CMS) site then there will be an admin area of the site which shouldn’t be crawled by a bot. Similarly, for an e-commerce site it wouldn’t make sense to allow a bot to crawl the basket page or the checkout page.

Another important role robots.txt has is to point the bot in the direction of your sitemap.xml file.
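As an illustration, a simple robots.txt for a hypothetical CMS-driven e-commerce site at mysite.co.uk might look like this (the paths shown are examples only and will vary between platforms):

    User-agent: *
    Disallow: /admin/
    Disallow: /basket/
    Disallow: /checkout/

    Sitemap: https://www.mysite.co.uk/sitemap.xml

The final line tells any visiting bot where to find the sitemap, which is covered next.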

Sitemap.xml

This is another file that should be placed in the root of your site; a bot will have been guided to it by your robots.txt file. It acts a bit like a road map, directing the bot to all the pages you wish to have crawled. A missing or incomplete sitemap can result in pages not being indexed, or taking longer than expected to be indexed.
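For reference, a minimal sitemap.xml listing just two pages of the hypothetical mysite.co.uk could look like this (the URLs and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.mysite.co.uk/</loc>
        <lastmod>2023-06-16</lastmod>
      </url>
      <url>
        <loc>https://www.mysite.co.uk/services/</loc>
      </url>
    </urlset>

Most CMS and e-commerce platforms can generate and update this file automatically.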

Metadata

There is a good chance you have heard of metadata before; after all, it is not a new concept. Metadata simply means data about data. Before they were digitised, libraries would use index card catalogues that allowed you to search for and locate a book. Each book in the library had its own index card containing metadata: information about the book such as the author, title and publisher.

A web page requires similar data for SEO. There are quite a few pieces of metadata that you can use, but the most important are the meta title and meta description. The meta title is what appears in the tab of your browser when you are viewing a page, and it is typically used as the clickable headline in a SERP. The meta description is the short summary displayed beneath it in a SERP result. It is usually around 150 characters long and should summarise the content of the page.

Both should include the principal keyword you want to target for the page. Both are important in helping a bot understand what a page is about, but they also matter to users scanning a SERP: the better written your title and description metadata are, the better the click-through rate (CTR) your page will achieve.
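In the HTML of a page, the two tags sit in the head and look like this (the wording here is purely illustrative):

    <head>
      <title>Handmade Oak Furniture | MySite</title>
      <meta name="description" content="Bespoke handmade oak furniture, crafted in the UK. Browse our range of tables, chairs and storage, all made to order.">
    </head>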

Load times

Visitors to a site expect a page to load quickly; if they are left waiting for a page to appear, there is a very real chance that they will leave. Search engine bots know this too. If your page takes more than a couple of seconds to load, it will not be considered to be providing a good user experience, and the result will be a page that performs poorly in search engine rankings.

There are many techniques an experienced web developer will use to ensure fast load times. This is an entire topic in itself and outside the scope of this article, but it includes writing efficient code, optimising image sizes, minifying CSS and JavaScript, and avoiding unnecessary and inefficient plugins.
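As one small example of these techniques, images can be given explicit dimensions and lazy-loaded, and non-critical scripts deferred, so the browser can render the visible part of the page sooner (the file names here are placeholders):

    <img src="hero.jpg" alt="Oak dining table" width="1200" height="800" loading="lazy">
    <script src="analytics.js" defer></script>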

A secure site

A major ranking factor in search engine algorithms is security. To stand any chance of ranking well, a site must be served over HTTPS. This protects the data that flows between the website and a user’s browser. An SSL/TLS certificate is required to enable HTTPS.
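Once a certificate is installed, it is also good practice to redirect all HTTP traffic to HTTPS so that only the secure version of the site gets indexed. On an Apache server, for example, this can be done in the site’s .htaccess file (a sketch; the exact approach depends on your hosting):

    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]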

Mobile friendly

The majority of visitors to your site will be using a mobile device. For a number of years now, search engines have recognised the importance of optimising a site for mobile, and they now operate a mobile-first indexing policy in recognition of just how prevalent mobile use has become. It is therefore essential that your website works flawlessly whether it is being viewed on a large desktop screen, a laptop, a tablet or a mobile device.
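At a minimum, a mobile-friendly page declares a responsive viewport and uses CSS media queries so the layout adapts to the screen size (the class name below is illustrative):

    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      .content { width: 960px; margin: 0 auto; box-sizing: border-box; }
      @media (max-width: 600px) {
        .content { width: 100%; padding: 0 16px; }
      }
    </style>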

Duplicate content

You’re probably thinking that surely this should be covered under content SEO, and you’re right, it should. However, there is also a technical SEO element to duplicate content that may not be very obvious.

As a human, when you look at a website it doesn’t really matter whether the URL has a www prefix or not, whether it ends with a trailing slash, or whether it has some kind of query string at the end: you can see it is all the same page. When a search bot views your site, however, it will infer that these are all different pages.

For example, the following URLs may all display the same page, but to a bot they are all different:

www.mysite.co.uk
mysite.co.uk
mysite.co.uk/
mysite.co.uk?id=1

The bot will therefore conclude that all these ‘pages’ feature duplicate content with each other. The result will be a weakened SERP presence.

You can overcome this with a technical SEO feature called canonicalisation. By assigning a canonical URL to a page, you are telling the bot which URL should be indexed, whatever variations of it the bot may encounter.
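The canonical URL is declared with a link tag in the page’s head. For the homepage in the example above, every variation would carry the same tag:

    <link rel="canonical" href="https://www.mysite.co.uk/">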

Broken links

If your site has any broken or dead links, you can be confident that a search engine bot will find them. A broken link does not make for a good user experience, and as a result the page will be punished in the rankings, so it is important that they are avoided. A good developer will also put procedures in place so that if a broken link does slip through, it is handled in a way that doesn’t disrupt the user experience too much. This is done by building a special page into the site known as a 404 page; the name refers to the HTTP status code returned when a browser requests a URL that doesn’t exist on the site. The 404 page saves the user from seeing a confusing error message and should instead provide links back to the main sections of the site.
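On an Apache server, for instance, a custom 404 page can be wired up with a single line in .htaccess (the file name is an example; other servers and CMS platforms have their own equivalents):

    ErrorDocument 404 /404.html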

In conclusion

I’ve covered quite a lot there. It is by no means an exhaustive list, but I hope it has given you an understanding of the importance and complexities of on-page technical SEO. A site built without a solid SEO foundation is likely to fail to drive traffic via search results: people will not find it organically, and site traffic will be low. But if it is built by someone who understands SEO, then as the site becomes established it will be set up for success.

It is true that you can pay Google or Bing to appear prominently on a search results page. This is known as Search Engine Marketing (SEM) and requires you to set up and run a pay-per-click (PPC) campaign. When running such a campaign, you are charged whenever a user clicks through to your site from a sponsored link. The amount you pay is determined by the popularity of the keyword you are targeting: the more popular it is, the more you will need to bid against your competitors to win one of the sponsored positions.

If you have the budget then search engine marketing can prove very successful, but the costs can also quickly escalate. A site with strong SEO for a targeted keyword can rank organically for free.

Hopefully this short post has helped to give an understanding of what SEO is and just why it is vital to the success of a website.

If you have a small business and are looking for a new website, it is so important that SEO is at the heart of the site build and its ongoing running from the very beginning. If you would like to discuss how Willow Leaf Web Solutions can help create an SEO-optimised site for your business, drop me a message to arrange a free, no-obligation chat.

Follow Willow Leaf Web Solutions on LinkedIn and Facebook to ensure that you don’t miss receiving notifications of newly published articles.
