4 Steps to Kick Start Your Technical SEO

Have you ever felt intimidated whenever “Technical SEO” was brought into the conversation? So have I. But then I slowly started to take the bull by the horns and realized one major thing: you don’t need to have technical skills to understand the technical part of SEO.

In fact, technical SEO covers all of the optimization work you do apart from the content itself: it refers to how well your website can be crawled, indexed, and ranked for specific keywords. You can improve it by paying close attention to page speed, crawl errors, and other criteria that search engines look for.

Let’s examine a few of the most common factors often seen as part of technical SEO and find out how they can affect your site’s ranking ability:

1. Ensure site and page speed

People often get these two mixed up, but site speed and page speed (also known as “page load time” or “time to first byte”) are not the same thing; however, both are equally important. Site speed is the speed across a sample of page views on a site, while page speed is how long a single page takes to load. According to Google, site speed (and consequently page speed) is an important factor for ranking your content in SERPs.

There’s plenty of research showing that the speedier your site is, the better your results will be. To check page speed, as well as other important factors, I use SEO Site Checkup or Google PageSpeed Insights. They even tell you how to fix your potential issues (you might need the help of a developer here, but at least you’ll know what to do).

Check it out for yourself

Head over to SEO Site Checkup, enter the URL of your blog, and hit “Checkup.” Your score, along with a list of recommendations, will show up. See the sample analysis for my site below.

SEO Site Checkup Score

Scroll through all the recommendations to “Site Loading Speed Test.”

Is your speed under 3 seconds? Then you’re in good shape. If not, you’ve got work to do.
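If you’d rather get a quick number yourself, here’s a rough Python sketch that measures time to first byte using only the standard library. It’s a client-side approximation (your own network latency is included), so treat the tools above as the source of truth; the function name is mine, not from any SEO tool.

```python
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    """Roughly measure time to first byte for a URL, in seconds.

    A quick client-side approximation: it times how long it takes
    until the first byte of the response body arrives.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1)  # block until the first byte arrives
    return time.monotonic() - start
```

For example, `time_to_first_byte("https://www.yoursitename.com/")` gives you a number to compare against the 3-second guideline above.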

2. Go over your Robots.txt file

Robots.txt is a file used to give instructions about your site to web robots; this is done through something called the Robots Exclusion Protocol.

So how does it work? In order for your page or content to rank in search engine results, it has to be indexed first. So a robot (also called a bot, spider, or crawler) comes to your site to crawl your pages so the search engine can decide how to rank them. But before it does so, the bot first checks your robots.txt file by accessing it at: http://www.example.com/robots.txt.

This file tells the bot which pages to crawl and which ones to leave out. If you don’t have a robots.txt file, bots will assume they’re allowed to crawl everything, including pages you may not want showing up in search results.

If you want to dig deeper, read this super cool and easy-to-understand article about what a robots.txt file is.
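If you’re curious how a crawler actually applies these rules, Python’s standard-library `urllib.robotparser` can read them for you. A minimal sketch, using a made-up robots.txt and placeholder example.com URLs:

```python
from urllib import robotparser

# A made-up robots.txt for illustration (not a recommendation)
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# can_fetch answers the same question a bot asks before crawling a URL
print(parser.can_fetch("*", "https://www.example.com/blog/post"))    # True: allowed
print(parser.can_fetch("*", "https://www.example.com/admin/login"))  # False: blocked
```

The parser checks the rules in order, so `/admin/login` hits the `Disallow: /admin/` line first, while everything else falls through to `Allow: /`.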

Check it out for yourself

Look for your robots.txt file. Go to www.yoursitename.com/robots.txt. Do you see something that looks like this:

robots txt example file

Image from CompleteWebResources.com

If yes, check it out and see which files are allowed to be crawled and which ones are not. Can't find anything that comes close to a robots.txt file? You probably don't have one, so consider adding one.

3. Make sure there’s an XML sitemap

A sitemap is an XML or HTML document containing a list of pages on your website that you want Google and other search engines to index. However, a sitemap alone does not guarantee that Google will actually index all of your pages.

So what’s the difference between an XML and HTML sitemap? The main difference is that XML sitemaps are written for search engines while an HTML one is written for humans.

A search engine spider can quickly and easily extract all the important pieces of information about your site by looking at the XML file. An HTML sitemap enables lost human users to find a page on your site that they are looking for.

If your blog is on WordPress, you can download the Google XML Sitemaps plugin and generate your sitemap automatically. Other platforms, like Wix, generate one for you, so you don’t need to worry about it.
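To see what an XML sitemap actually contains, here’s a minimal sketch that builds one with Python’s standard `xml.etree.ElementTree`. The URLs are placeholders, and real generators (like the plugins above) add optional fields such as `lastmod` and `changefreq`:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Return a minimal XML sitemap string for a list of page URLs."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a prefix
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in pages:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration
print(build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/blog/first-post",
]))
```

Each page you want indexed gets its own `<url>` entry with a `<loc>` holding the full address.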

Check it out for yourself

Look for your sitemap by accessing https://yoursitename.com/sitemap.xml

The page can’t be found? Oooops, no sitemap. Better make sure you have one in place.

4. Check Your Google Search Console

The Google Search Console, formerly known as Google Webmaster Tools, is a service similar to Google Analytics that monitors the search results data for your website. If you haven’t configured it yet, you probably should. No worries, it only takes about 5 minutes; to set it up, head over to A Beginner’s Guide to Google Search Console.

So what’s so special about GSC? For starters, you can see how many pages of your website have been indexed and whether there are crawl errors. Google loves a site that’s error-free, so if you have many 404s or 302s, that’s gonna be a problem. The most common errors are:

404 – the page Googlebot tried to crawl doesn’t exist anymore

This can happen when you move your site to a new domain and the old URLs are still indexed: because the content moved to a new domain, those old URLs no longer exist. This usually resolves itself once Google indexes your new pages. If it doesn’t, you’ll have to fix your 404s yourself. How? Read about it on Yoast.

302 – temporary redirects

Redirects can be permanent (301) or temporary (302). A permanent redirect is what you need to direct users from, let’s say, MySite.com to MyNewSite.com; you may need to do that when you move your site to a different domain. In this case, a permanent redirect tells the search engines that the old location should be removed from their index and replaced with the new location. Using 301 redirects is the most search-engine-friendly way to redirect traffic without losing existing rankings.

With 302 redirects you’re telling the search engines to look at the content on the new page, but to keep checking the original URL as it will be reestablished shortly.

It all sounds clear so far. The problem is when you use 302 redirects instead of 301s, and that tends to happen quite often. If you set the wrong type of redirect, search engines may become confused, resulting in a loss of traffic. Read more about redirects and learn when to use them.
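If you want to check which status code a given URL actually returns, here’s a small Python sketch using only the standard library. It deliberately refuses to follow the redirect so the 301 or 302 itself stays visible; the URL in the usage note below is a placeholder.

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the status code can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise HTTPError instead

def redirect_status(url):
    """Return the HTTP status code of `url`, e.g. 301, 302, or 200."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code
```

For example, `redirect_status("http://www.yoursitename.com/old-page")` tells you whether that old URL answers with a 301, a 302, a 404, or a plain 200.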

Check it out for yourself

Go to your Google Search Console account and check out your errors.

Google Search Console Crawl Errors

These are just a few things to get you started. If you have any questions or comments related to this article or any Technical SEO challenge just send them over. I'd be more than happy to help and support you however I can.

#SEO #TechnicalSEO
