The Ultimate Technical SEO Checklist (2024 Edition)

Chad Peterson
May 3, 2024

The first step in a comprehensive SEO strategy is improving your technical SEO.

No matter what industry your brand or company is in, the principles of technical SEO have never been more important. Ensuring your website is technically sound increases organic traffic, ranking keywords, and conversions.

Here is your ultimate checklist to ensure you’re putting your best technical SEO foot forward.

1. Improve core web vitals

Google’s Core Web Vitals are critical metrics that assess a website’s overall user experience and influence rankings. Here’s a breakdown of each metric.

Largest Contentful Paint (LCP) measures the time it takes for the biggest element on the page to load (i.e., the largest picture). To provide a positive user experience, this should happen within 2.5 seconds.

Interaction to Next Paint (INP) measures user interface responsiveness. INP replaced First Input Delay (FID) as part of Core Web Vitals on March 12, 2024. An acceptable INP is 200ms or less.

Cumulative Layout Shift (CLS) measures the visual stability of elements on the screen. Sites should strive for their pages to maintain a CLS score below 0.1.

You can measure these ranking factors in a Google Search Console report, showing which URLs have potential issues.

Here are the performance ranges for each status:

  Metric   Good      Needs Improvement   Poor
  LCP      <=2.5s    <=4s                >4s
  INP      <=200ms   <=500ms             >500ms
  CLS      <=0.1     <=0.25              >0.25

Tools and tips for improving core web vitals

There are plenty of tools to help you improve your site speed and Core Web Vitals. The go-to choice is Google PageSpeed Insights. Webpagetest.org is also a helpful place to check how fast different pages of your site are from various locations, operating systems, and devices.
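
Google’s open-source web-vitals JavaScript library is the usual way to collect these numbers from real users, but if you want a dependency-free look at what the browser reports, here is a minimal TypeScript sketch using the PerformanceObserver API. It assumes a Chromium-based browser (which exposes the largest-contentful-paint and layout-shift entry types), and the CLS figure here is a simplified running sum rather than the official session-window calculation.

    // Minimal field measurement of LCP and CLS using PerformanceObserver.
    // Assumes a Chromium-based browser that supports these entry types.

    // LCP: the last 'largest-contentful-paint' entry reported before user
    // interaction is the value that counts.
    let lcp = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        lcp = entry.startTime; // milliseconds since navigation start
      }
      console.log(`LCP candidate: ${Math.round(lcp)} ms (target: <= 2500 ms)`);
    }).observe({ type: 'largest-contentful-paint', buffered: true });

    // CLS: sum of layout-shift scores not caused by recent user input.
    // (Simplified; the official metric groups shifts into session windows.)
    let cls = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        const shift = entry as PerformanceEntry & { value: number; hadRecentInput: boolean };
        if (!shift.hadRecentInput) {
          cls += shift.value;
        }
      }
      console.log(`CLS so far: ${cls.toFixed(3)} (target: <= 0.1)`);
    }).observe({ type: 'layout-shift', buffered: true });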

Optimizations you can make to improve website speed include:

  • Implementing lazy-loading for non-critical images (see the sketch after this list)
  • Optimizing image formats for the browser
  • Improving JavaScript performance
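
As a concrete example of the first item, here is a minimal TypeScript sketch of lazy-loading below-the-fold images. The data-src attribute is an assumed convention for this illustration; browsers that support the native loading attribute use it directly, and older ones fall back to an IntersectionObserver.

    // Sketch: lazy-load below-the-fold images.
    // Assumes such images are marked up with a data-src attribute instead of
    // src, e.g. <img data-src="/images/photo.jpg" alt="..."> (hypothetical convention).

    const lazyImages = document.querySelectorAll<HTMLImageElement>('img[data-src]');

    if ('loading' in HTMLImageElement.prototype) {
      // Native lazy loading: let the browser decide when to fetch.
      lazyImages.forEach((img) => {
        img.src = img.dataset.src!;
        img.loading = 'lazy';
      });
    } else {
      // Fallback: only set src once the image approaches the viewport.
      const observer = new IntersectionObserver((entries, obs) => {
        for (const entry of entries) {
          if (entry.isIntersecting) {
            const img = entry.target as HTMLImageElement;
            img.src = img.dataset.src!;
            obs.unobserve(img);
          }
        }
      }, { rootMargin: '200px' }); // start loading shortly before the image scrolls into view

      lazyImages.forEach((img) => observer.observe(img));
    }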

2. Replace intrusive interstitials with banners

Intrusive interstitials are elements that obstruct the view of the primary content (i.e., website pop-ups). Web managers frequently use them for promotional purposes.

Google recommends avoiding pop-ups for sales promotions or newsletters. They frustrate users and erode trust. Instead, Google suggests carefully placed banners.

Similarly, it’s essential to keep from overloading pages with ads, as they negatively impact E-E-A-T signals and tank the user experience.

3. Ensure content displays well on mobile devices

Google says pages must load quickly, be easy to navigate, and be easy to take action on to work well on mobile devices. Ensure pages use a responsive design that adjusts for different screen sizes. Review image sizes and quality to improve page speeds. Improve menus, breadcrumbs, internal links, and contact buttons to improve navigation.

4. Review safe browsing site status (HTTP vs HTTPS)

Google announced the HTTPS protocol as a ranking factor in 2014. If your site’s pages still use HTTP, it’s time to add an SSL or TLS security certificate.

HTTPS protects visitor data, ensures encryption, and safeguards against hackers and data leaks. It enhances web performance and supports new features like service workers, web push notifications, and existing features like credit card autofill and HTML5 geolocation API, which are not secure with HTTP.

Leverage Google’s Safe Browsing site status tool to review your site’s safe browsing status. The HTTPS report in Google Search Console shows how many of your indexed pages are served over HTTP vs. HTTPS.
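
How you enforce HTTPS depends on your hosting. As one illustration, assuming a Node.js/Express server (an assumption; any web server or CDN can express the same rule in its own configuration), here is a minimal sketch that permanently redirects HTTP requests and sets an HSTS header.

    // Sketch: force HTTPS on a Node.js/Express server (assumed stack; adjust
    // for your own server, CDN, or load balancer configuration).
    import express, { Request, Response, NextFunction } from 'express';

    const app = express();

    app.use((req: Request, res: Response, next: NextFunction) => {
      // Behind a proxy or load balancer, the original protocol usually
      // arrives in the x-forwarded-proto header.
      const proto = req.headers['x-forwarded-proto'] ?? req.protocol;
      if (proto !== 'https') {
        // 301 tells browsers and search engines the move is permanent.
        return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
      }
      // HSTS: instruct browsers to use HTTPS for future visits.
      res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
      next();
    });

    app.get('/', (_req, res) => res.send('Hello over HTTPS'));
    app.listen(3000);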

5. Include signed exchanges for faster page loads

Signed exchanges (SXG) enable Google Search to prefetch your website’s content while maintaining user privacy.

Prefetching—which includes key resources like HTML, JavaScript, and images—helps render web pages quicker when a user clicks on search results. Faster rendering improves the Largest Contentful Paint (LCP) score, enhancing the overall page experience.

Before implementing SXG, analyze your website’s traffic and loading performance to identify pages or resources that frequently slow down user experience, indicating a need for caching.

Review and optimize the SXG-cached resources based on performance data to ensure quick load times and a better user experience.
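
SXG is usually enabled through your CDN or Google’s Web Packager toolchain rather than hand-written application code, but you can script a quick check of whether a given URL actually returns a signed exchange when one is requested. Here is a minimal Node.js (18+) TypeScript sketch; the URL is a placeholder.

    // Sketch: check whether a URL serves a signed exchange (SXG) when asked
    // for one. The URL below is a placeholder.

    async function checkSxg(url: string): Promise<void> {
      const res = await fetch(url, {
        headers: {
          // Advertise SXG support the way a crawler-initiated prefetch would.
          Accept: 'application/signed-exchange;v=b3;q=0.9,*/*;q=0.8',
        },
      });
      const contentType = res.headers.get('content-type') ?? '';
      if (contentType.includes('application/signed-exchange')) {
        console.log(`${url} is served as a signed exchange (${contentType})`);
      } else {
        console.log(`${url} did not return an SXG (content-type: ${contentType})`);
      }
    }

    checkSxg('https://www.example.com/').catch(console.error);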

6. Look for crawl errors

Ensure your site is free from crawl errors. Crawl errors occur when a search engine tries to reach a page on your website but fails.

Use Screaming Frog or another crawling tool to scan your site, then review the results for crawl errors. You can also check for these issues in Google Search Console.

When scanning for crawl errors, you’ll want to:

  • Go through error pages (e.g., 401, 404, 500) to figure out where to redirect them
  • Correctly implement all redirects with 301 redirects
  • Find and fix redirect chains or loops
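
A crawler like Screaming Frog handles this at scale, but for spot-checking a handful of URLs, here is a minimal Node.js (18+) TypeScript sketch that reports status codes and walks redirect chains hop by hop; the URLs are placeholders.

    // Sketch: spot-check status codes and redirect chains for a list of URLs.
    // Runs on Node.js 18+ (global fetch). The URLs below are placeholders.

    async function traceUrl(startUrl: string, maxHops = 5): Promise<void> {
      const chain: string[] = [];
      let url = startUrl;

      for (let hop = 0; hop <= maxHops; hop++) {
        const res = await fetch(url, { redirect: 'manual' });
        chain.push(`${res.status} ${url}`);

        const location = res.headers.get('location');
        if (res.status >= 300 && res.status < 400 && location) {
          url = new URL(location, url).toString(); // resolve relative Location headers
          continue;
        }
        break; // terminal response: 2xx is healthy, 4xx/5xx needs a fix
      }

      const note = chain.length > 2 ? '  <- redirect chain, collapse to one 301' : '';
      console.log(chain.join(' -> ') + note);
    }

    const urlsToCheck = ['https://www.example.com/', 'https://www.example.com/old-page'];
    Promise.all(urlsToCheck.map((u) => traceUrl(u))).catch(console.error);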

7. Fix broken links

It’s frustrating for people to click a link on your website and find that it doesn’t take them to the correct—or working—URL. Bad links negatively impact the user experience for humans and search engine optimization. This scenario applies to internal and external links.

Check for the following optimization opportunities:

  • Links that go to an error page (e.g., 401, 403, 404 error codes)
  • Links with a 301 or 302 that redirect to another page
  • Orphaned pages (pages without any links to them)
  • An internal linking structure that is too deep

To fix broken links, update the target URL to the new working page. If the original content no longer exists, remove the link and implement a 301 redirect from the dead URL to a relevant live page.
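
If you want to script a quick pass over a single page rather than run a full crawl, here is a minimal Node.js (18+) TypeScript sketch that extracts the page’s links and flags any that return an error code. The page URL is a placeholder and the href extraction is deliberately naive.

    // Sketch: flag broken links on a single page.
    // Node.js 18+ (global fetch). Uses a naive regex for href extraction; a
    // real crawler or HTML parser handles edge cases better.

    async function findBrokenLinks(pageUrl: string): Promise<void> {
      const html = await (await fetch(pageUrl)).text();

      // Collect http(s) link targets found in the markup.
      const links = new Set<string>();
      for (const match of html.matchAll(/href="([^"#]+)"/g)) {
        try {
          const target = new URL(match[1], pageUrl);
          if (target.protocol === 'http:' || target.protocol === 'https:') {
            links.add(target.toString());
          }
        } catch {
          // ignore values that are not valid URLs
        }
      }

      for (const link of links) {
        // Some servers reject HEAD; fall back to GET if you see 405s in practice.
        const res = await fetch(link, { method: 'HEAD', redirect: 'follow' });
        if (res.status >= 400) console.log(`BROKEN (${res.status}): ${link}`);
      }
    }

    findBrokenLinks('https://www.example.com/').catch(console.error);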

8. Get rid of duplicate content

Make sure there’s no duplicate content on your site. Duplicate content can be caused by many factors, including page replication from faceted navigation, having multiple versions of the site live, and scraped or copied content.

You must only allow Google to index one version of your site. For example, search engines see all of these as different websites rather than one: http://abc.com, http://www.abc.com, https://abc.com, and https://www.abc.com. If your preferred version is https://www.abc.com, the other three versions should 301 redirect directly to it.

Fixing duplicate content can be implemented in the following ways:

  • Set up 301 redirects to the primary version of the webpage
  • Implement noindex or canonical tags on duplicate pages
  • Handle URL parameters with canonical tags or crawl rules (Google Search Console’s URL Parameters tool has been retired)
  • Signal your preferred domain with consistent canonical tags and 301 redirects (Google Search Console no longer offers a preferred domain setting)
  • Where possible, delete duplicate content
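
Following the abc.com example above, and again assuming a Node.js/Express server (most web servers and CDNs can express the same rule in configuration instead), here is a minimal sketch that 301-redirects every non-preferred protocol/host combination straight to https://www.abc.com.

    // Sketch: 301-redirect non-preferred hostnames to the canonical
    // https://www.abc.com origin (Express assumed for illustration).
    import express, { Request, Response, NextFunction } from 'express';

    const PREFERRED_HOST = 'www.abc.com';
    const app = express();

    app.use((req: Request, res: Response, next: NextFunction) => {
      const host = (req.headers.host ?? '').toLowerCase();
      const proto = req.headers['x-forwarded-proto'] ?? req.protocol;

      if (host !== PREFERRED_HOST || proto !== 'https') {
        // One permanent redirect straight to the preferred version, avoiding
        // chains like http://abc.com -> https://abc.com -> https://www.abc.com.
        return res.redirect(301, `https://${PREFERRED_HOST}${req.originalUrl}`);
      }
      next();
    });

    app.get('/', (_req, res) => res.send('Served from the preferred domain'));
    app.listen(3000);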

9. Make sure URLs have a clean structure

Overly complex URLs can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Google may be unable to completely index all your site’s content. Google suggests a website’s URL structure should be as simple as possible.

URL issues Google says to avoid:

  • Session IDs or other unnecessary parameters
  • Non-readable characters (e.g., Unicode or emojis)
  • Foreign languages not using UTF-8 in URLs
  • Underscores instead of hyphens
  • Overly complex structures
  • Dynamic generation
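
As a rough illustration of these guidelines, here is a small TypeScript sketch that flags common offenders in a URL; the parameter names and thresholds are assumptions, not an official list.

    // Sketch: flag URLs that break the guidelines above. The checks are
    // illustrative rather than exhaustive.

    function auditUrl(raw: string): string[] {
      const issues: string[] = [];
      const url = new URL(raw);

      const suspectParams = ['sessionid', 'sid', 'phpsessid', 'utm_source'];
      for (const key of url.searchParams.keys()) {
        if (suspectParams.includes(key.toLowerCase())) {
          issues.push(`unnecessary parameter: ${key}`);
        }
      }
      if (/[^\x20-\x7e]/.test(raw)) issues.push('non-ASCII characters in URL');
      if (url.pathname.includes('_')) issues.push('underscores instead of hyphens');
      if (url.pathname.split('/').filter(Boolean).length > 4) {
        issues.push('deeply nested path');
      }
      return issues;
    }

    console.log(auditUrl('https://www.example.com/blog_posts/2024?sessionid=abc123'));
    // -> [ 'unnecessary parameter: sessionid', 'underscores instead of hyphens' ]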

10. Set up and optimize XML sitemap(s)

XML sitemaps tell search engines about your site structure and which pages you want crawled and indexed.

An optimized XML sitemap should include the following:

  • Only 200-status URLs
  • Any new content added to your site
    • E.g., recent blog posts, products, etc.
  • No more than 50,000 URLs
    • Sites with more URLs should use multiple XML sitemaps to maximize crawl budgets

You should exclude the following from the XML sitemap:

  • URLs that 301 redirect, carry a noindex tag, or canonicalize to another URL
  • URLs with 4xx or 5xx status codes
  • URLs with parameters
  • Duplicate content

You can check the Index Coverage report in Google Search Console to see if your XML sitemap contains errors.
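
Most CMSs and SEO plugins generate sitemaps for you, but as an illustration of the rules above, here is a minimal Node.js TypeScript sketch that builds sitemap files from a list of indexable URLs and splits them at the 50,000-URL limit; the URLs and file names are placeholders.

    // Sketch: generate XML sitemap files from a list of indexable (200-status,
    // canonical) URLs, splitting at the 50,000-URL limit.
    import { writeFileSync } from 'node:fs';

    const MAX_URLS_PER_SITEMAP = 50_000;

    function buildSitemap(urls: string[]): string {
      const entries = urls
        .map((loc) => `  <url>\n    <loc>${loc}</loc>\n  </url>`)
        .join('\n');
      return `<?xml version="1.0" encoding="UTF-8"?>\n` +
        `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${entries}\n</urlset>\n`;
    }

    function writeSitemaps(urls: string[]): void {
      for (let i = 0; i * MAX_URLS_PER_SITEMAP < urls.length; i++) {
        const chunk = urls.slice(i * MAX_URLS_PER_SITEMAP, (i + 1) * MAX_URLS_PER_SITEMAP);
        writeFileSync(`sitemap-${i + 1}.xml`, buildSitemap(chunk));
      }
    }

    writeSitemaps([
      'https://www.example.com/',
      'https://www.example.com/blog/technical-seo-checklist',
    ]);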

11. Optimize the robots.txt file

Not every page on your site belongs in the search results or needs to be crawled. A robots.txt file gives search engine robots instructions about which parts of your website they may crawl. Here are some example URLs you should disallow in your robots.txt file:

  • Admin pages
  • Temporary files
  • Search-related pages
  • Cart & checkout pages
  • URLs that contain parameters

Confirm your robots.txt file isn’t blocking anything you want to be indexed.

Robots.txt is particularly useful for large websites (5,000+ pages) that need to manage their crawl budget—the maximum number of pages a search engine allocates to crawl a site within a specific timeframe.

Correctly configuring the robots.txt file allows large website managers to prioritize indexing their most important pages and fosters more efficient use of crawl budgets.

Pro tip:

The robots.txt file should include the location of the XML sitemap. You can use the robots.txt report in Google Search Console (the successor to Google’s standalone robots.txt tester) to verify that your file works correctly.
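
As a simple illustration, here is a Node.js TypeScript sketch that writes a robots.txt covering the example URL groups above and pointing crawlers at the XML sitemap; every path and the sitemap URL are placeholders to adjust for your own site.

    // Sketch: generate a robots.txt that mirrors the examples above. The
    // paths and sitemap URL are placeholders, not recommendations for any
    // specific site structure.
    import { writeFileSync } from 'node:fs';

    const disallowedPaths = [
      '/admin/',      // admin pages
      '/tmp/',        // temporary files
      '/search',      // internal search results pages
      '/cart',        // cart & checkout pages
      '/checkout',
      '/*?*',         // URLs that contain parameters
    ];

    const robotsTxt = [
      'User-agent: *',
      ...disallowedPaths.map((path) => `Disallow: ${path}`),
      '',
      // Point crawlers at the XML sitemap, as recommended in the pro tip above.
      'Sitemap: https://www.example.com/sitemap.xml',
      '',
    ].join('\n');

    writeFileSync('robots.txt', robotsTxt);
    console.log(robotsTxt);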

12. Add structured data and schema

Structured data is code that helps provide information about a page and its content. It gives Google context about the meaning of a page and helps your organic listings stand out on the SERPs.

Structured data influences your chance of winning the featured snippet at the top of the SERPs. There are many different kinds of schema markups for structuring data for people, places, organizations, local businesses, reviews, and so much more.

You don’t need to know how to code; you can use online schema markup generators. Google’s Rich Results Test (the successor to the Structured Data Testing Tool) can help you validate the markup on your website.
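
As one example, here is a minimal TypeScript sketch that injects Organization schema as JSON-LD; the organization details are placeholders, and in practice the same markup often lives directly in your HTML templates.

    // Sketch: inject Organization schema as JSON-LD. The details below are
    // placeholders for illustration only.

    const organizationSchema = {
      '@context': 'https://schema.org',
      '@type': 'Organization',
      name: 'Example Company',
      url: 'https://www.example.com',
      logo: 'https://www.example.com/images/logo.png',
      sameAs: [
        'https://www.linkedin.com/company/example-company',
        'https://twitter.com/example',
      ],
    };

    const script = document.createElement('script');
    script.type = 'application/ld+json';
    script.textContent = JSON.stringify(organizationSchema);
    document.head.appendChild(script);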

Win featured snippets and PAA with headings, lists, and tables

You don’t need schema code to win a featured snippet. Google uses headings (H2s, H3s, and so on), lists (numbered or bulleted), and tables to improve the richness of the results it displays (rich results).

Adding a number at the front of every H2 or H3 in a list-style article helps Google populate the featured snippet and improves your chances of winning People Also Ask placements.

Ensure list items are no longer than a sentence and stick to one numbered list per article. Leverage tables for listing technical data or for versus/comparison-style content.

13. Review site health regularly

Even small website changes can cause technical SEO site health fluctuations. Internal and external links break when URLs change on your own site or on the websites you point to.

New pages, reorganized content, site migrations, and redesigns do not always carry over important SEO elements like schema markup, sitemaps, and robots.txt, or they move them to a place Google will not recognize.

Plan to crawl your site and review each item on this checklist whenever significant changes are made to your website, and on a regular schedule (at least every 12 months), so a technical SEO issue doesn’t disrupt organic website traffic.

Are you looking for a partner who specializes in technical SEO? We’re here to help!

Contact Perfect Search Media to ensure your website is set up to appear at the top of organic search results.
