A 5-Step Technical SEO Checklist for 2018
A strong technical foundation is the backbone of any successful SEO strategy, and the start of 2018 is a good time to take a step back and look at how your website’s organic performance can be improved.
Need some guidance? Here’s a handy 5-step technical SEO checklist for 2018.
Step 1) Crawl your site and look for any crawl errors
The first step to ensuring your site is technically sound is to crawl it. You can use Screaming Frog, Deep Crawl, seoClarity – there are many tools out there to help you do this. Once you’ve crawled the site, look for any crawl errors. You can also check this with Google Search Console.
When scanning for crawl errors, you’ll want to a) ensure that all redirects are properly implemented as 301 redirects and b) go through any 4xx and 5xx error pages and decide where to redirect them.
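The triage described above can be sketched in a few lines of Python. This is a minimal illustration, assuming you have already exported a URL-to-status-code mapping from your crawler; the URLs below are hypothetical.

```python
# A minimal sketch of a status-code audit, assuming you already have a
# crawl export (e.g. from Screaming Frog) as a URL -> status-code map.
# The URLs and mapping below are illustrative, not from any real site.

def classify_status(code):
    """Bucket an HTTP status code the way a crawl audit would."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 308):
        return "permanent redirect"   # the redirect type you want
    if code in (302, 303, 307):
        return "temporary redirect"   # consider upgrading to a 301
    if 400 <= code < 500:
        return "client error"         # candidate for a 301 redirect
    if 500 <= code < 600:
        return "server error"         # fix on the server side
    return "other"

crawl_results = {                     # hypothetical crawl export
    "/old-page": 302,
    "/missing": 404,
    "/": 200,
    "/broken": 500,
}

for url, code in crawl_results.items():
    print(f"{url}: {code} -> {classify_status(code)}")
```

Anything bucketed as a temporary redirect, client error, or server error goes on your fix list.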
Step 2) Make sure your URLs have a clean structure
Straight from the mouth of Google: “A site’s URL structure should be as simple as possible.” Overly complex URLs can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site.
As a result, Googlebot may be unable to completely index all the content on your site.
Here are some examples of problematic URLs:
-Sorting parameters. Some large shopping sites provide multiple ways to sort the same items, resulting in a much greater number of URLs that all point to the same content.
-Irrelevant parameters in the URL, such as referral parameters.
Where possible, you’ll want to shorten URLs by trimming these unnecessary parameters.
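As a sketch of what that trimming looks like, here is a small Python function that strips a blocklist of query parameters from a URL. The parameter names (`ref`, `sort`, `sessionid`) are hypothetical examples, not a definitive list; adapt the set to the parameters your own site generates.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of parameters that don't change page content.
UNNECESSARY_PARAMS = {"ref", "sort", "sessionid", "utm_source"}

def clean_url(url):
    """Drop unnecessary query parameters (and any fragment) from a URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in UNNECESSARY_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(clean_url("https://example.com/shoes?color=red&sort=price&ref=mail"))
# -> https://example.com/shoes?color=red
```

Parameters that do change the content (like `color` here) survive; tracking and sorting parameters are dropped.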
Step 3) Ensure your site has an optimized XML sitemap
XML sitemaps tell search engines how your site is structured and which pages you want crawled and indexed. An optimized XML sitemap should include:
-Any new content that’s added to your site (new blog posts, products, etc.).
-Only 200-status URLs.
-No more than 50,000 URLs. If your site has more URLs, you should have multiple XML sitemaps to maximize your crawl budget.
You should exclude the following from the XML sitemap:
-URLs with parameters
-URLs that 301 redirect or that carry canonical or noindex tags
-URLs with 4xx or 5xx status codes
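Putting the above together, a well-formed XML sitemap looks something like this; the `example.com` URLs and dates are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-post</loc>
    <lastmod>2018-01-20</lastmod>
  </url>
</urlset>
```

Every `<loc>` entry should be a clean, canonical, 200-status URL.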
Step 4) Make sure your site has an optimized robots.txt file
A robots.txt file gives search engine robots instructions on how to crawl your website. Every website has a “crawl budget,” a limited number of pages search engines will crawl in a given period, so it’s imperative to make sure that budget is spent on your most important pages.
On the flip side, you’ll want to make sure your robots.txt file isn’t blocking anything that you do want to be indexed.
Here are some example URLs that you should disallow in your robots.txt file:
-Cart & checkout pages
-URLs that contain parameters
Finally, you’ll want to include the location of the XML sitemap in the robots.txt file.
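A minimal robots.txt covering the rules above might look like this. The paths and sitemap URL are hypothetical, and note that the `*` and `?` wildcard patterns are honored by Google but not guaranteed for every crawler.

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /*?

Sitemap: https://www.example.com/sitemap.xml
```

The `Disallow: /*?` line blocks any URL containing a query string, which covers most parameterized URLs in one rule.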
Step 5) Get rid of any duplicate or thin content
The last item on our checklist is to make sure there’s no duplicate or thin content on your site. Duplicate content can be caused by many factors, including page replication from faceted navigation, having multiple versions of the site live, and scraped or copied content.
Duplicate content can be fixed in the following ways:
-Setting up 301 redirects
-Implementing noindex or canonical tags on duplicate pages
-Setting the preferred domain in Google Search Console
-Setting up parameter handling in Google Search Console
-Where possible, deleting any duplicate content
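For the canonical-tag option, the fix is a single line in the `<head>` of each duplicate page pointing at the version you want indexed; the URL below is a hypothetical example.

```html
<!-- In the <head> of every duplicate/variant page: -->
<link rel="canonical" href="https://www.example.com/shoes/red-sneakers" />
```

All parameterized or faceted variants of that page should point their canonical tag at the same clean URL.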
Now that you’ve gone through the checklist, you can start rolling out any technical changes you need to make for the year. Happy auditing!