PAGESPEED

Performance analysis and Core Web Vitals to improve Search Engine Rankings.

INDEX:

Overview
PageSpeed Insights

Nowadays, the performance of a website is vital for organic ranking in Google’s SERPs, a trend further confirmed by the new Core Web Vitals, which entered the ranking factors in May 2021.

The SEO Spider’s “PageSpeed” tab helps you collect data from two sources:

  • PageSpeed Insights, which uses Lighthouse to check the speed of web pages using “lab data.”
  • Chrome User Experience Report (CrUX, or “field data”), which provides real-world browsing data.

To populate the data in the “PageSpeed” tab, you need to connect to the PSI API:

Configuration > API > PageSpeed Insights

The Metrics

From the PageSpeed tab, several general metrics are available:

  • Total Size Savings.
  • Total Time Savings.
  • Total Requests.
  • Total Page Size.
  • HTML Size.
  • HTML Count.
  • Image Size.
  • Image Count.
  • CSS Size.
  • CSS Count.
  • JavaScript Size.
  • JavaScript Count.
  • Font Size.
  • Font Count.
  • Media Size.
  • Media Count.
  • Other Size.
  • Other Count.
  • Third Party Size.
  • Third Party Count.

Additional, more specific metrics are divided between “CrUX Metrics,” which correspond to Chrome field data, and “Lighthouse Metrics,” with lab data returned by PageSpeed Insights.

CrUX Metrics

  • CrUX Performance.
  • CrUX First Contentful Paint Time (sec).
  • CrUX First Contentful Paint Category.
  • CrUX First Input Delay Time (sec).
  • CrUX First Input Delay Category.
  • CrUX Origin Performance.
  • CrUX Origin First Contentful Paint Time (sec).
  • CrUX Origin First Contentful Paint Category.
  • CrUX Origin First Input Delay Time (sec).
  • CrUX Origin First Input Delay Category.

Lighthouse Metrics

  • Performance Score.
  • Time to First Byte (ms).
  • First Contentful Paint Time (sec).
  • Speed Index Time (sec).
  • Time to Interactive (sec).
  • First Contentful Paint Score.
  • First Meaningful Paint Time (sec).
  • First Meaningful Paint Score.
  • Speed Index Score.
  • Estimated Input Latency (ms).
  • Estimated Input Latency Score.
  • First CPU Idle (sec).
  • First CPU Idle Score.
  • Time to Interactive Score.

PageSpeed Filters

In addition to metrics, the tab includes the following filters:

  • Eliminate Render-Blocking Resources: displays all pages with resources that block the “First Paint” of the page, along with the potential savings.
  • Properly Size Images: highlights all pages that contain incorrectly sized images, along with the potential data savings if they were corrected.

Ideally, your web page should not serve images larger than the version rendered on the user’s screen. The most appropriate strategy is to serve “responsive images” by generating multiple versions and specifying which one to use in your HTML or CSS via media queries, viewport sizes, etc.
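A minimal sketch of this approach (file names, widths, and breakpoints are purely illustrative):

  <img src="photo-800.jpg"
       srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
       sizes="(max-width: 600px) 400px, 800px"
       alt="Illustrative responsive image">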

Other solutions include using an image CDN or the SVG vector format, which scales to any size and is well suited, for example, to icons.

  • Defer Offscreen Images: identifies all pages that contain hidden or offscreen images, along with potential savings. To address the problem, consider lazy-loading these images after critical resources in order to lower the “Time to Interactive” metric.
  • Minify CSS: all pages that have unminified CSS files, along with the potential savings if they were minified correctly.

Minifying CSS improves page load time because CSS files are often larger than they need to be. They can contain unnecessary characters, such as comments, whitespace, and indentation.

In production, these characters can be safely removed to reduce file size without affecting the way the browser processes styles. This technique is called minification.

For example, a rule like the following can be minified into a single line:
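(A minimal sketch; the selector and values are hypothetical.)

  /* before minification */
  .hero-title {
      /* comments, whitespace and indentation add unnecessary bytes */
      color: #336699;
      margin: 0 auto;
  }

  /* after minification: one line, same effect */
  .hero-title{color:#369;margin:0 auto}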

  • Minify JavaScript: shows all pages with unminified JavaScript files, along with potential savings. Minifying JavaScript files can reduce payload size and script parsing time.
    Minification is the process of removing whitespace and any code that is not needed, producing a smaller but perfectly functional file. For example, Terser is a popular JavaScript compression tool, and webpack v4 includes a plugin for this library by default to create minified build files.
  • Remove Unused CSS: identifies pages with unused CSS, along with potential savings in bytes. Developers usually add style sheets to a page like this:
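    (A minimal sketch; main.css is the file discussed in the paragraph below.)

      <link rel="stylesheet" href="main.css">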

The main.css file that the browser downloads is called an external style sheet and is stored separately from the HTML that references it. Whenever a browser encounters such a reference, it has to download, parse, and process every external style sheet it finds before it can render any content on the user’s screen.

  • Efficiently Encode Images: the filter displays pages with unoptimized images, along with potential savings. Lighthouse collects all JPEG or BMP images on the page, sets the compression level of each image to 85, and then compares the original version with the compressed version. If the potential savings is 4 KiB or greater, Lighthouse flags the image as optimizable.
  • Serve Images in Next-Gen Formats: highlights pages with images served in older formats such as JPEG and PNG. The advice is to use formats such as AVIF and WebP, which offer superior compression and quality characteristics.
  • Enable Text Compression: all pages with text-based resources that have not been compressed, along with potential savings.
  • Preconnect to Required Origin: the filter identifies all pages that connect to domains other than yours but are not yet prioritizing those fetch requests with rel=preconnect links, along with potential savings. Before requesting a resource from another server, the browser needs to look up the domain name and resolve it to an IP address, set up a connection to the server, and encrypt that connection. At each of these steps the browser sends data to and receives responses from the server; this exchange is called a “Round Trip” and, depending on network conditions, takes time that inevitably affects performance.

To optimize this, you can use “preconnect” for external resources (connections to other domains), which tells the browser which origins to connect to first because they are considered a priority.

An example of the application of “preconnect” could be for Google Tag Manager or Google Analytics:
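A minimal sketch (the two origins are the standard Google Tag Manager and Google Analytics hosts):

  <!-- placed in the <head>, before any scripts loaded from these origins -->
  <link rel="preconnect" href="https://www.googletagmanager.com">
  <link rel="preconnect" href="https://www.google-analytics.com">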

“Critical chain requests” are the set of dependent, important network requests required for the correct rendering of the HTML document.

  • Reduce Server Response Times (TTFB): loading speed is essential for a good user experience, and users are very sensitive to it. The longer a site takes to load, the higher its abandonment rate. One of the most common causes is a slow server response when returning page content to the browser. This filter identifies all pages where the browser had to wait more than 600 ms for the server response to the main document request.
  • Avoid Multiple Redirects: displays pages that have resources that redirect, and the potential savings of using the direct URL. Redirects slow down page loading. When a browser requests a resource that has been redirected, the server usually returns an HTTP response like this:
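    (An illustrative response; the status code and target location are hypothetical.)

      HTTP/1.1 301 Moved Permanently
      Location: /assets/styles-new.css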

and the browser is forced to make another HTTP request to retrieve the resource at the new location. This is not meant to outlaw the use of 3xx responses altogether, but rather to discourage chains of redirects or multiple redirects on the same page.
Redirects in resources needed for the “Critical Rendering Path” should always be avoided.

  • Preload Key Requests: identifies HTML pages that have elements that could be handled with “preload”.
    When a page is requested, the browser fetches the HTML document from the server, parses its content, and sends separate requests for each referenced resource included in the document. To improve performance, it is recommended to accelerate this process by requesting critical resources in advance.
    A typical case is a font referenced in a CSS file, since the browser would only discover it after reading the style sheet. Using “preload” you retrieve in advance a resource that the browser would otherwise only consider at a later stage, for example in critical chain requests.
    Remember that the browser caches a preloaded resource without executing it; having already fetched it, the browser can serve it immediately once it is actually requested.
    This feature is very suitable for dependencies such as JavaScript, fonts, and CSS.

This feature is a directive to the browser and does not block the loading of the HTML document. To implement “preload”, simply add a <link> tag in the <head> section of the HTML or use the HTTP Link header.
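A minimal sketch of both options (the font path is hypothetical):

  <!-- in the <head> of the HTML document -->
  <link rel="preload" href="/fonts/example.woff2" as="font" type="font/woff2" crossorigin>

  (equivalent HTTP response header)
  Link: </fonts/example.woff2>; rel=preload; as=font; crossorigin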

  • Use Video Format for Animated Images: identifies pages with animated GIFs, along with the potential savings if converted to video.
  • Avoid Excessive DOM Size: identifies all pages with a DOM size that exceeds the recommended 1,500 total nodes. Keeping an eye on the DOM is important because an excessive size can negatively affect performance in several ways:
    • the DOM often includes many nodes that are not visible to the user when the page first loads; this unnecessarily increases the volume of data served to users and slows down loading (load performance).
    • during browsing, the many interactions by users and scripts cause the browser to recompute the position and style of DOM nodes, creating rendering slowdowns (runtime performance).
    • if JavaScript uses broad query selectors such as document.querySelectorAll('li'), it may unknowingly store references to a very large number of nodes, which can overload the memory of users' devices (memory performance).
  • Reduce JavaScript Execution Time: flags pages with medium or slow JavaScript execution time. When an HTML page spends a long time executing JavaScript, performance suffers in several ways:
    • Network cost increases: more bytes mean longer download times.
    • JavaScript is parsed and compiled on the main thread, which is then not available for user interaction.
    • Occupying the main thread also delays TTI (Time to Interactive), penalizing the UX of the page and the related Web Vitals.
    • If the JavaScript retains many references, it can consume a lot of memory, degrading the user's experience of the page.

To improve JavaScript execution performance, it is generally recommended to “minify” and compress the code, remove unused code, and cache it.

  • Serve Static Assets With An Efficient Cache Policy: returns all pages with resources that are not “cached,” along with potential savings. When a browser requests a resource, the server providing the resource can tell the browser how long it should temporarily store or cache the resource. For each subsequent request for that resource, the browser uses its local copy rather than obtaining it from the network. Fonts, media, images, or style sheets that respond with status code 2xx and have not explicitly declared a “no-cache policy” are considered in Lighthouse evaluations.

Here is an example of caching via HTTP header:
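(A minimal sketch; 31,536,000 seconds corresponds to one year.)

  Cache-Control: max-age=31536000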

The “max-age” directive instructs the browser how long a resource will remain in the cache. No less than one year is recommended for “static assets.”

  • Minimize Main-Thread Work: identifies all pages with average or slow execution times on the main thread that penalize the rendering of the HTML document. The browser's rendering process turns your code into a web page that users can interact with. By default, the main thread of the rendering process handles most of the code: it parses the HTML and builds the DOM, parses the CSS and applies the specified styles, and parses, evaluates, and executes the JavaScript.

The main thread also processes user events. Thus, whenever the main thread is busy doing something else, the web page may not respond to user interactions, leading to a bad experience. To improve this process, we recommend optimizing third-party JavaScript, eliminating unused code, reducing and minifying CSS references, deferring non-critical CSS, and splitting JavaScript payloads.

  • Ensure Text Remains Visible During Webfont Load: all pages with fonts that may blink or become invisible during resource loading.
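One common remedy, sketched here with a hypothetical font name and path, is to set font-display in the @font-face rule so the browser keeps showing fallback text while the web font loads:

  @font-face {
      font-family: "ExampleFont";
      src: url("/fonts/example.woff2") format("woff2");
      font-display: swap; /* text stays visible with a fallback font during load */
  }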

Export Performance Data

All performance data, source pages, and URLs of resources that could be optimized can be exported in bulk via the PageSpeed reports menu:

Reports > PageSpeed
