Core Web Vitals Analysis

Analysis of Core Web Vitals and monitoring opportunities for on-page optimization

Core Web Vitals

By connecting to the PageSpeed Insights API, Screaming Frog SEO Spider lets you easily measure Core Web Vitals (CWV) for each page of your website.

Core Web Vitals are a set of metrics that Google introduced as ranking signals in 2021 to measure key aspects of the user experience while a web page loads. They give Google a concrete way to evaluate web pages and assign rankings in search results.

At the moment there are essentially three, but I believe other metrics will be added over time, alongside refinements of the existing ones, all increasingly geared toward ensuring a more satisfying browsing experience.

LCP: Largest Contentful Paint

The Largest Contentful Paint (LCP) metric reports the rendering time of the largest image or block of text visible within the viewport, relative to when the page began loading. As an evaluation parameter, Google considers 2.5 seconds the maximum time for a good, satisfying user experience.

To make sure you reach this goal for most of your users, a good threshold to consider is the 75th percentile of page loads, segmented between mobile and desktop devices.
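
If you want to see the LCP value for a single page load yourself, a minimal sketch using the browser's PerformanceObserver API (run in the page's console or bundled from TypeScript; the logging is illustrative only) might look like this:

```typescript
// Minimal sketch: observe LCP candidates for the current page load.
// Each new entry replaces the previous candidate; the last one reported
// before the user interacts or leaves the page is the final LCP.
const lcpObserver = new PerformanceObserver((entryList) => {
  const entries = entryList.getEntries();
  const lastEntry = entries[entries.length - 1];
  console.log('LCP candidate (ms since navigation):', lastEntry.startTime);
});
lcpObserver.observe({ type: 'largest-contentful-paint', buffered: true });
```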

FID: First Input Delay

The First Input Delay (FID) of a page measures its responsiveness to user interactions. For example, if a user clicks a link or opens a drop-down menu, FID measures how quickly the page responds to that action.

Usually, user interactions are delayed when a browser is busy performing other tasks.

For example, analyzing the loading timeline below, the blue bars identify the time the browser spends loading a resource (such as a JavaScript file) that takes quite a long time.

If a user clicked on something in the HTML page during this interval, the interaction would be delayed until the browser completed the task in progress.

As an evaluation parameter, to provide a good browsing experience for the user, Google indicates that the First Input Delay metric should be equal to or less than 100 milliseconds. To make sure you reach this goal for most of your users, a good threshold to measure is the 75th percentile of page loads, segmented between mobile and desktop devices.
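
For reference, here is a minimal sketch of how FID can be observed in the browser with the PerformanceObserver API (the interface below is a local helper, not part of the standard DOM typings):

```typescript
// Minimal sketch: measure FID for the current visit.
// FID = delay between the user's first interaction and the moment the
// browser was free to start running its event handlers.
interface FirstInputEntry extends PerformanceEntry {
  processingStart: number;
}

new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries() as FirstInputEntry[]) {
    console.log('FID (ms):', entry.processingStart - entry.startTime);
  }
}).observe({ type: 'first-input', buffered: true });
```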

CLS: Cumulative Layout Shift

Cumulative Layout Shift (CLS) is a very important user-experience metric that measures the visual stability of a layout while an HTML page loads. It was introduced to limit unpleasant, unexpected shifts caused, for example, by banner ads or other late-loading elements. We have all had the frustrating experience of reading a news site and having the article text jump further down as the rest of the page loads; CLS measures precisely this instability.

Below you can see this phenomenon while the page is being loaded by the browser: frame by frame, content is added, causing earlier items to shift up and down.

To provide a good user experience, sites should strive to have a CLS score of 0.1 or lower. To make sure you reach this goal for most of your users, a good threshold to measure is the 75th percentile of page loads, segmented between mobile and desktop devices.
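
As a rough sketch, layout shifts can be observed in the browser like this (a simplified running total; the official CLS score groups shifts into session windows, so treat this as indicative only):

```typescript
// Minimal sketch: accumulate layout-shift values for the current page load.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;
  hadRecentInput: boolean;
}

let clsTotal = 0;
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries() as LayoutShiftEntry[]) {
    // Shifts caused by recent user input are excluded from CLS.
    if (!entry.hadRecentInput) {
      clsTotal += entry.value;
    }
  }
  console.log('Running CLS total:', clsTotal.toFixed(3));
}).observe({ type: 'layout-shift', buffered: true });
```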

Core Web Vitals are based on real-user data from the Chrome User Experience Report (CrUX), collected from visits to a page over the previous 28 days.

Each Web Vital is rated according to three classifications: Good, Needs Improvement, and Poor.

For a page to pass the Core Web Vitals assessment, it must be rated “Good” on all three metrics.

Google has provided a convenient illustration of the thresholds for each metric.
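
To make the classification concrete, here is a small sketch using Google's published thresholds (2.5 s / 4 s for LCP, 100 ms / 300 ms for FID, 0.1 / 0.25 for CLS); the sample values are hypothetical:

```typescript
type Rating = 'Good' | 'Needs Improvement' | 'Poor';

// Rate a metric against its "good" and "poor" thresholds.
function rate(value: number, good: number, poor: number): Rating {
  if (value <= good) return 'Good';
  if (value <= poor) return 'Needs Improvement';
  return 'Poor';
}

const ratings = {
  lcp: rate(1990, 2500, 4000), // ms
  fid: rate(37, 100, 300),     // ms
  cls: rate(0.06, 0.1, 0.25),
};

// The page passes the assessment only if all three metrics are "Good".
const passesAssessment = Object.values(ratings).every((r) => r === 'Good');
console.log(ratings, passesAssessment);
```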

Core Web Vitals Analysis Parameters

A website’s Core Web Vitals are recorded and stored in the Chrome User Experience Report, and you can view them in essentially two ways.

For a single URL, you can use PageSpeed Insights (PSI), which tells you whether the page has passed or failed the assessment.

The second solution is to consult the “Core Web Vitals” section of Google Search Console, which reports page performance, errors, and warnings in relation to these metrics.

Although these two checks are effective, they are also quite limiting: the first only lets you check one URL at a time, while GSC groups multiple URLs together (URL patterns) and does not provide scores for individual pages.
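
If you prefer to check a single URL programmatically rather than through the PSI web interface, a sketch of a call to the public PageSpeed Insights v5 API might look like the following (the API key and URL are placeholders; the response fields shown are the ones PSI documents for CrUX field data):

```typescript
// Sketch: fetch CrUX field data for one URL from the PageSpeed Insights API.
const API_KEY = 'YOUR_API_KEY';             // placeholder, use your own key
const pageUrl = 'https://www.example.com/'; // placeholder URL

async function fetchFieldData(url: string): Promise<void> {
  const endpoint =
    'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
    `?url=${encodeURIComponent(url)}&strategy=mobile&key=${API_KEY}`;
  const response = await fetch(endpoint);
  const data = await response.json();

  // loadingExperience holds the CrUX field data for this URL, when available.
  const metrics = data.loadingExperience?.metrics ?? {};
  console.log('Overall:', data.loadingExperience?.overall_category);
  console.log('LCP:', metrics.LARGEST_CONTENTFUL_PAINT_MS?.category);
  console.log('FID:', metrics.FIRST_INPUT_DELAY_MS?.category);
  console.log('CLS:', metrics.CUMULATIVE_LAYOUT_SHIFT_SCORE?.category);
}

fetchFieldData(pageUrl);
```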

Web Vitals Analysis with Screaming Frog

Thanks to its connection to the PageSpeed Insights API, Screaming Frog SEO Spider lets you collect Web Vitals data easily, returning URL-specific results based on real CrUX field data and lab data. To set it up, just follow a few simple steps.

  • 1. Connection to the PageSpeed Insights API.
  • 2. Selection of the metrics to be considered.

Once connected to the PSI API, click the “Metrics” tab and choose which metrics to include as analysis parameters. For your first Core Web Vitals scans, I recommend keeping the default selection, since it includes data from the CrUX report, lab data from Google Lighthouse, and opportunity data highlighting areas for improvement.

  • 3. Scan execution.

With the API metrics configured, you can start the crawl and wait for the website to be scanned and the Core Web Vitals data to populate.

  • 4. Data analysis.

All you have to do is click on the PageSpeed tab to see all the crawl details: which URLs passed the test, which need improvement, and which have serious issues.

The example above (with default settings) shows the URLs in the first column and the “Core Web Vitals Assessment” status in the second. A URL is marked “Pass” or “Fail” depending on whether it is rated “Good” on all three Web Vitals.

In the screenshot you can see that the top URL has an LCP of 1.99s (under 2.5s), a FID of 37ms (under 100ms), and a CLS of 0.06 (under 0.1) so it has passed CWV validation.

This overview lets you immediately see which pages failed the test and which specific CWV metrics to act on.

Some URLs may not return CWV data at all; this happens when CrUX does not hold enough speed data for the page, usually because it does not receive enough visitors. Generally, only the most popular pages have enough data for the Chrome UX Report.

  • 5. Segmentation of error patterns.

Once you have the data, you can examine each CWV metric individually, using the different filters to isolate the issues to optimize and to group the URLs that share the same critical problems.

By highlighting a URL in the upper window and clicking on the “PageSpeed Details” tab (lower window), you will find a host of additional details and opportunities for reviewing the errors or warnings found.

Optimize LCP

We begin by defining which areas can affect LCP and which factors can be optimized to improve the metric.

  • Properly Size Images.
  • Defer Offscreen Images.
  • Efficiently Encode Images.
  • Serve Images in Next-Gen Formats.

The starting point involves images, which are one of the pivotal elements of HTML page performance. To improve LCP, make sure each image is properly sized and use “lazy loading” where appropriate.

It is advisable to use new, more efficient image formats such as WebP and improve the encoding of existing image files.
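
As a small illustration of “Defer Offscreen Images”, here is a sketch that applies native lazy loading to below-the-fold images (the data-below-fold attribute is a hypothetical marker you would add yourself; in practice you would usually set loading="lazy" directly in the HTML):

```typescript
// Sketch: defer offscreen images with native browser lazy loading.
document
  .querySelectorAll<HTMLImageElement>('img[data-below-fold]')
  .forEach((img) => {
    img.loading = 'lazy';    // fetch only when the image nears the viewport
    img.decoding = 'async';  // decode off the critical rendering path
  });
```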

Another very impactful aspect involves server response times. An underpowered server leads to slow responses and longer loading times; therefore, I recommend choosing higher-performance hosting or using a CDN (Content Delivery Network) to reduce response times and improve TTFB (Time To First Byte).

Among the possible optimizations, place particular emphasis on managing resources that block page rendering. As you know, while the HTML is being parsed, any reference to external stylesheets or synchronous scripts can cause the parser to pause, delaying the LCP.

Ideally, you should defer any non-critical CSS and JavaScript to speed up the loading of your main content, and use “preload” for critical resources. Another tip is to minify CSS and JavaScript and remove unnecessary files from the page load; also consider using “preconnect” for third-party resources (e.g. Google Analytics, Tag Manager), as shown in the sketch after the list below.

  • Minify CSS.
  • Minify JavaScript.
  • Remove Unused CSS.
  • Remove Unused JavaScript.
  • Enable Text Compression.
  • Preconnect to Required Origins.
  • Preload key Requests.
  • Use Video Formats for Animated Content.
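
As an illustration of the “preload” and “preconnect” hints mentioned above, here is a minimal sketch that injects them from a script; in practice you would normally place the equivalent <link> tags directly in the HTML head, and the paths and origins below are placeholders:

```typescript
// Sketch: add resource hints for a critical stylesheet and a third-party origin.
function addHint(rel: 'preload' | 'preconnect', href: string, as?: string): void {
  const link = document.createElement('link');
  link.rel = rel;
  link.href = href;
  if (as) link.as = as;
  document.head.appendChild(link);
}

addHint('preload', '/css/critical.css', 'style');          // critical CSS
addHint('preconnect', 'https://www.googletagmanager.com'); // third-party origin
```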

Optimize FID

In most cases, FID is adversely affected by the long execution times of large scripts. The first action to take is to minify your scripts and remove unused parts of them (or entire scripts), reducing the time spent downloading, parsing, and executing these files.

  • Minify JavaScript.
  • Remove Unused JavaScript.
  • Reduce the Number of Third-Party Scripts Used.

To unearth the list of third-party resources loaded by your HTML document, consult the PageSpeed Details tab.

Another optimization to consider concerns main-thread work. If the browser’s main thread is busy, the user cannot interact with the page until that work is complete. Reducing main-thread execution time makes user interactions more responsive and immediate.

The main solutions involve reducing JavaScript execution time and minimizing main-thread work, as in the sketch after the following list.

  • Minimise Main-Thread Work
  • Reduce JavaScript Execution Time
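
One common pattern for keeping the main thread responsive is to break long-running work into small chunks and yield between them; a minimal sketch (the chunk size is arbitrary):

```typescript
// Sketch: process a large array in chunks, yielding to the browser between
// chunks so pending user input can be handled promptly.
async function processInChunks<T>(items: T[], handle: (item: T) => void): Promise<void> {
  const CHUNK_SIZE = 50; // arbitrary
  for (let i = 0; i < items.length; i += CHUNK_SIZE) {
    items.slice(i, i + CHUNK_SIZE).forEach(handle);
    // Give the main thread a chance to handle interactions before continuing.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
}
```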

Then consider that each resource needed for a page takes time to be requested, downloaded, and executed. By reducing both the number of requests needed and the total size of the download, you will help improve page execution time.

Useful activities for this purpose are CSS and JavaScript minification, removal of unused CSS and JavaScript, and text compression.

  • Lower Resource Counts & Sizes.
  • Minify CSS.
  • Minify JavaScript.
  • Remove Unused CSS.
  • Remove Unused JavaScript.
  • Enable Text Compression.

Optimize CLS

As mentioned earlier, layout stability is the core of CLS and the key to a pleasant experience for the visitor. Focus on serving page elements in the appropriate order, with defined space reserved for each resource, so that nothing jolts around while the user browses.

For satisfactory results, you must ensure that all images, ads, videos, and iframes have explicit width and height attributes, so that pages load with the appropriate space reserved and other content is not displaced when these elements appear. Another consideration from a CLS perspective is to prioritize loading resources from the top of the page down, so that users who have already scrolled to a certain depth are not bounced back up.
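
A quick way to spot offenders is a small audit snippet run in the browser console; this sketch only reports elements, it does not fix them:

```typescript
// Sketch: list images and iframes without explicit width/height attributes,
// which are likely to shift the layout once they load.
document.querySelectorAll<HTMLElement>('img, iframe').forEach((el) => {
  if (!el.getAttribute('width') || !el.getAttribute('height')) {
    console.warn('Missing explicit dimensions:', el.outerHTML.slice(0, 120));
  }
});
```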

Make sure that custom fonts do not cause FOIT/FOUT effects: if custom fonts are applied late in the page load, you may see text flash as it is swapped in (FOUT, flash of unstyled text) or remain invisible until the custom font renders (FOIT, flash of invisible text). To get around this latency, I recommend preloading these critical resources.
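
A minimal sketch of preloading a critical web font from a script (the font path is a placeholder; the equivalent <link rel="preload"> tag in the HTML head achieves the same thing):

```typescript
// Sketch: preload a critical web font so it is fetched before first render.
const fontHint = document.createElement('link');
fontHint.rel = 'preload';
fontHint.as = 'font';
fontHint.type = 'font/woff2';
fontHint.href = '/fonts/brand.woff2'; // placeholder path
fontHint.crossOrigin = 'anonymous';   // fonts must be requested with CORS
document.head.appendChild(fontHint);
```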

Screaming Frog and Web Vitals

If a URL shows no CrUX data even though its PSI status reads “Success”, it is likely that the page does not receive enough real-user visits to generate speed data for the CrUX report. If you find yourself in this situation after crawling with Screaming Frog, you can analyze a page with a similar layout that does have CrUX data, or rely on simulated lab data (included by default in the SEO Spider configuration).

When you have to rely on lab data, the only metric not available is FID (First Input Delay), because there is no real user interaction to measure. In that case you can use Total Blocking Time (TBT) as a proxy: it shows how long the main thread is blocked and which resources are responsible, and those are the same resources likely to raise FID, as in the sketch below.
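
As a rough lab-style proxy, you can sum the blocking portion of long tasks in the browser; this sketch is only an approximation of TBT, not the exact Lighthouse calculation (which is bounded by FCP and TTI):

```typescript
// Sketch: approximate Total Blocking Time by summing the part of each long
// task that exceeds the 50 ms budget.
let totalBlockingTime = 0;
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    totalBlockingTime += Math.max(0, entry.duration - 50);
  }
  console.log('Approximate TBT (ms):', totalBlockingTime);
}).observe({ type: 'longtask', buffered: true });
```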

If you need to analyze a large site, you might exceed the PageSpeed Insights API quota. In that case I recommend analyzing a representative sample of URLs or waiting for the quota to reset.

Don’t forget that CrUX data is based on the last 28 days of user interactions, so any changes to the site will take 28 days to be reflected in CrUX.

Web Vitals Video Tutorial
