Leveraging Google Search Console’s New Indexation Coverage Report

September 20, 2018

Perfect Search’s very own Director of SEO & Content Justin McIntyre presented during today’s fall 2018 Digital Olympus conference. Sad you missed it? Don’t worry—we’ve put together a quick little SparkNotes summary for you.

Digital Olympus is a free online digital marketing conference. Their fall 2018 conference took place on September 20th, 2018 and hosted a variety of digital marketing experts. Justin spoke on Google Search Console’s new indexation coverage report. Here’s what you missed.

 

What is the new GSC indexation coverage report?

It’s a new tool that gives you a comprehensive view of your indexed pages, plus a big-picture analysis of the pages on your site that aren’t being crawled or indexed.

The new report is especially helpful because the information comes straight from Google itself; in the past, you might’ve had to make educated guesses about your site’s indexation.

You can choose how you want to review coverage errors. The report can review all known pages (the most comprehensive review of all pages on your website that Google can detect), all submitted pages (all pages included in your XML sitemaps), or a specific XML sitemap.
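As a refresher, an XML sitemap is simply a list of the URLs you’d like Google to crawl and index. A minimal sketch (using example.com as a placeholder domain and made-up URLs) looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to consider -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-09-20</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
  </url>
</urlset>
```

The report’s “all submitted pages” view only covers URLs listed in files like this, so keeping your sitemaps current directly shapes what the report shows you.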

For more info on the report, check out Stephanie’s summary.

 

What are the four status types?

1) Error

The error status applies to any page that Google is unable to index. If important pages are affected, you could lose keyword rankings and traffic.

Error pages could mean that you have soft 404s, hard 404s, or other crawl issues like blocked resources. If you have a lot of 404 pages, look into fixing them or removing those URLs from your XML sitemap.

 

2) Valid with warnings

This status means that these pages have been indexed, but you might not actually want them in the index. For example, a page can be indexed even though it’s blocked by your robots.txt file. This can happen when Google sees that the page has a lot of internal links or high-value backlinks, so it indexes the URL anyway, even though your robots.txt tells it not to crawl the page.

Confusing or mixed signals cause this status type. Review the pages flagged as valid with warnings to find and resolve the conflicting signals.
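To see how those mixed signals arise: a robots.txt Disallow rule only blocks crawling, not indexing, so a blocked URL that earns links elsewhere can still land in the index. A minimal sketch (the /private/ path here is just an illustrative placeholder):

```text
# robots.txt at https://www.example.com/robots.txt
# This blocks crawling of /private/, but Google can still index those
# URLs based on links pointing at them -- which is exactly the kind of
# conflict that shows up as "valid with warnings."
User-agent: *
Disallow: /private/
```

If you actually want a page kept out of the index, the usual fix is the opposite approach: allow crawling and add a noindex directive (such as a meta robots tag set to “noindex”) so Google can see the instruction.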

 

3) Valid

This means that your URLs have been successfully crawled and indexed. Nice work!

Generally speaking, this is a good thing. However, you should still keep an eye on this status type. If you see a dramatic increase or decrease in the number of valid pages, something fishy could be going on.

 

4) Excluded

This status applies to URLs that webmasters have intentionally kept out of the index, such as 404 pages, duplicate pages, or pages with redirects. Carefully review these pages and make sure that the signals Google is receiving are accurate and purposeful.

 

So what does this mean for marketers?

The new indexation coverage report gives you more insights—straight from Google—on how Google sees your site.

Plus, you can share a link from the report directly to other parties like developers or decision-makers. They don’t need to log into GSC to view the link, so it’s an easy way to share updates.

If you haven’t taken advantage of the new report, there’s no time like the present!

Senior Manager, Copywriting & Content Strategy
Kayla Hammersmith is a huge fan of Nancy Drew computer games and swears that she can do a very specific impression of Pal, the dog from Arthur. You might often find her snacking on goat cheese as she dreams of one day becoming a cellist savant.
