Posted 5 August, 2020 in SEO

How Many Sites Pass the Core Web Vitals Assessment?

Speed has long been an official ranking factor, but with the introduction of the Core Web Vitals (CWV), many an SEO might have noticed the ominous Pass/Fail assessment within PageSpeed Insights.

PageSpeed Insights Assessment

While these metrics aren’t yet used in Google’s algorithm, I saw so many URLs failing that it got me wondering: how many well-ranking URLs end up passing the assessment?

2,500 keywords, 20,000 URLs, and just as many graphs later, I may have found the answer.

TL;DR – Across 20,000 URLs:

  • Only 12% of Mobile and 13% of Desktop results passed the CWV assessment (i.e. considered good in all three metrics).
  • First Input Delay (FID) is negligible, with 99% of Desktop and 89% of Mobile URLs considered good.
  • 43% of Mobile and 44% of Desktop URLs had a good Largest Contentful Paint (LCP).
  • 46% of Mobile and 47% of Desktop URLs had a good Cumulative Layout Shift (CLS).
  • URLs in Position 1 were 10% more likely to pass the CWV assessment than URLs in Position 9.

 

Methodology

As Core Web Vitals are evaluated on a per URL basis, I took 2,500 keywords across 100 different topics, scraping all the first-page organic results of each. In total I ended up with about 22,500 URLs. This was duplicated for both mobile and desktop results.

These were then run through the SEO Spider connected to the PageSpeed Insights API, gathering the necessary PSI & CrUX data.

A couple of caveats:

  • All results were scraped from a search in Berkshire, UK.
  • No rich result URLs were included.
  • 10th position is excluded as so few SERPs had 10 organic listings, making the sample size considerably lower.
  • A handful of results had featured snippets. These are classified as position 1 but may not be the ‘true’ 1st position.
  • Some sites appeared across multiple rankings (e.g. Wikipedia).
  • Several URLs could not be analysed in PSI for various reasons.
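If you want to replicate a smaller version of this gathering step yourself, the PageSpeed Insights v5 API returns the CrUX field data alongside the lab data in a single response. A rough sketch in Python (stdlib only; the helper names are my own, but the endpoint and the `loadingExperience` response fields are from the public PSI v5 API):

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_crux_metrics(url, strategy="mobile", api_key=None):
    """Query the PSI v5 API and return the CrUX field metrics for a URL,
    or None when the URL has no field data (as several in this study did)."""
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    request_url = PSI_ENDPOINT + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(request_url, timeout=60) as resp:
        data = json.load(resp)
    return data.get("loadingExperience", {}).get("metrics") or None

def passes_assessment(metrics):
    """Pass = every Core Web Vital's field category is FAST ('good')."""
    if metrics is None:
        return False
    cwv_keys = ("LARGEST_CONTENTFUL_PAINT_MS",
                "FIRST_INPUT_DELAY_MS",
                "CUMULATIVE_LAYOUT_SHIFT_SCORE")
    return all(metrics.get(k, {}).get("category") == "FAST" for k in cwv_keys)
```

At the scale of this study (~20,000 URLs, twice over for mobile and desktop) you’d want an API key and some throttling, which is exactly the legwork the SEO Spider’s PSI integration handles for you.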

A Bit on Core Web Vitals

For anyone reading who might not be aware of Core Web Vitals – they’re three metrics Google will use to judge page experience, and they will become an official ranking factor some time in 2021.

Why? To help push the web forward, encouraging site owners to provide better experiences for users – Aaaand likely helping Google to render the web a bit quicker and more efficiently at the same time. Win-Win.

They’re recorded using real user metrics (rUM) from the Chrome User Experience Report (CrUX). (Google search may also use lab data where CrUX is not available, but the analysis below focuses on rUM.) PageSpeed Insights (PSI) then reports on the 75th percentile of this data (i.e. the value that only the slowest 25% of loads exceed), and classifies it by the following thresholds:

Core Web Vital thresholds
  • Largest Contentful Paint (LCP) : measures loading performance. To provide good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.
  • First Input Delay (FID) : measures interactivity. To provide good user experience, pages should have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS) : measures visual stability. To provide good user experience, pages should maintain a CLS of less than 0.1.

To pass the Core Web Vitals assessment, a URL needs to be considered ‘good’ in all three metrics.
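That pass logic is simple to express in code. A minimal sketch (Python with NumPy for the percentile; the function and variable names are mine, not CrUX’s) applying the thresholds above to the 75th percentile of a set of per-load samples:

```python
import numpy as np

# 'Good' thresholds from the list above
LCP_GOOD_S = 2.5    # Largest Contentful Paint, seconds
FID_GOOD_MS = 100   # First Input Delay, milliseconds
CLS_GOOD = 0.1      # Cumulative Layout Shift, unitless

def passes_cwv(lcp_samples_s, fid_samples_ms, cls_samples):
    """A URL passes only if the 75th percentile of every metric
    falls within its 'good' threshold."""
    return (
        np.percentile(lcp_samples_s, 75) <= LCP_GOOD_S
        and np.percentile(fid_samples_ms, 75) <= FID_GOOD_MS
        and np.percentile(cls_samples, 75) <= CLS_GOOD
    )

# A page that loads quickly but shifts layout badly for some users
# still fails, because all three metrics must be good:
print(passes_cwv([1.8, 2.1, 2.4], [40, 60, 90], [0.02, 0.05, 0.3]))  # → False
```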

What Did the Data Highlight?

As suspected, only a small proportion of sites ended up passing the CWV assessment – shock! From our list of URLs, only 12% of Mobile and 13% of Desktop results passed the CWV assessment.

Excluding those without rUM brought this to 23% and 24% respectively.

What’s more interesting is looking at individual pass rates for each ranking position:

Core Web Vital Pass Rate by Position

URLs in Position 1 had a pass rate of 19% on Mobile and 20% on Desktop. Moving from 1st to 5th saw a 2% decrease per position, while the remaining results from 5-9 flattened out to a pass rate of around 10% on Mobile and 11% on Desktop.
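The per-position figures above boil down to a simple aggregation. With the results in a table of one row per URL (the column names here are my own, as a toy stand-in for the real dataset), the pass rate by position is a one-line groupby in pandas:

```python
import pandas as pd

# Toy stand-in: one row per ranking URL, with its SERP position
# and whether it passed the CWV assessment
df = pd.DataFrame({
    "position": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "passed":   [True, True, False, True, False, False, False, False, False],
})

# The mean of a boolean column is the pass rate; scale to a percentage
pass_rate = (df.groupby("position")["passed"].mean() * 100).round(1)
print(pass_rate)
```

The same groupby, run once per device type, produces the chart above.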

So what’s going on here? Have CWVs been a top-secret ranking factor all along?

Very unlikely, but perhaps not far from the truth. From what I’ve noticed, it tends to boil down to two aspects:

A major part of the CWV assessment focuses on load speed, which we know is already a ranking factor. Therefore, logic would suggest that quicker sites may rank slightly higher and end up passing the assessment in turn.

However, Google continually comments that speed is a minor factor. Instead, I suspect sites ranking in the first 1-4 positions tend to be better optimised overall, with targeted, rich, and user-friendly content, all while loading this information more efficiently.

Breaking Down the Vitals

We can also view the individual metrics on a more granular level. The following table shows classification across the whole URL sample:

Core Web Vital Breakdown

First Input Delay

FID is negligible, with 89% of Mobile and 99% of Desktop URLs within the good threshold, averaging around 56ms on Mobile and 13ms on Desktop.

When comparing against position, we see much less of a correlation:

FID by Position

Largest Contentful Paint

LCP saw 43% of Mobile and 44% of Desktop URLs considered good. This averaged out at 3.13s for Mobile and 3.04s for Desktop.

When compared against position, we can see a slight trend, but only a 0.14s difference between 1st and 9th:

LCP by Position

We can also see this reflected in the pass rates (considered good) for each position:

Mobile LCP Breakdown Desktop LCP Breakdown

Cumulative Layout Shift

The CLS pass rates were much higher than I anticipated, as this is usually where we see most sites fail. CLS had 46% of Mobile and 47% of Desktop URLs considered good, averaging a CLS of 0.29 on Mobile and 0.25 on Desktop.

This also saw less of a correlation against position, though 1st and 2nd tended to be slightly lower:

CLS by Position

When looking at individual pass rates by ranking, we can see a decline in the percentage of ‘good’ URLs as the position moves down the SERP.

Mobile CLS Breakdown Desktop CLS Breakdown

First Contentful Paint

Lastly, while it’s not a CWV, I also extracted the FCP from the CrUX data as another measure of speed. This saw an average of 2.19s on Mobile and 1.99s on Desktop.

While relatively unchanged on desktop, mobile saw a slight increase in load times by position, but only 0.10s between 1st & 9th:

FCP by Position

What Can You Take Away from This?

Well, not a whole lot… (sorry). This is still a fairly small sample, and Core Web Vitals are not an official ranking factor just yet and won’t be until 2021, meaning their true impact is yet to be seen.

But if you do happen to load up PageSpeed Insights and see the disheartening ‘fail’ message, fear not – you’re in good company with most other sites.

Will passing the assessment immediately boost your rankings? Probably not. Should you strive to pass it anyway? Definitely.

Regardless of how much ranking benefit speed and CWVs provide, having your pages as quick, responsive, and stable as possible is great for users and search engines alike.

If you’re looking to make improvements, PageSpeed Insights is a great place to start, and you can easily grab data across your entire site with the SEO Spider and the steps here.