How To Hack Your Google Lighthouse Scores In 2024

Salma Alam-Naylor

2024-06-11T18:00:00+00:00
This article is sponsored by Sentry.io
Google Lighthouse has been one of the most effective ways to gamify and promote web page performance among developers. Using Lighthouse, we can assess web pages based on overall performance, accessibility, SEO, and what Google considers “best practices”, all with the click of a button.
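For reference, the same audits can also be run outside the browser UI. Here is a minimal sketch of Lighthouse’s documented programmatic Node API (the URL is a placeholder, and exact options and result fields may vary between Lighthouse versions):

```js
import fs from 'node:fs';
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

// Launch a headless Chrome instance for Lighthouse to drive.
const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

// Audit only the performance category and produce an HTML report.
const options = {
  logLevel: 'info',
  output: 'html',
  onlyCategories: ['performance'],
  port: chrome.port,
};

const runnerResult = await lighthouse('https://example.com', options);

// `.report` is the HTML report; `.lhr` is the Lighthouse result object.
fs.writeFileSync('lighthouse-report.html', runnerResult.report);
console.log('Performance score:', runnerResult.lhr.categories.performance.score * 100);

await chrome.kill();
```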
We might use these tests to evaluate out-of-the-box performance for front-end frameworks or to celebrate performance improvements gained by some diligent refactoring. And you know you love sharing screenshots of your perfect Lighthouse scores on social media. It’s a well-deserved badge of honor worthy of a confetti celebration.

Just the fact that Lighthouse gets developers like us talking about performance is a win. But, whilst I don’t want to be a party pooper, the truth is that web performance is far more nuanced than this. In this article, we’ll examine how Google Lighthouse calculates its performance scores, and, using this information, we will attempt to “hack” those scores in our favor, all in the name of fun and science — because in the end, Lighthouse is simply a good, but rough, guide for debugging performance. We’ll have some fun with it and see to what extent we can “trick” Lighthouse into handing out better scores than we may deserve.

But first, let’s talk about data.
Field Data Is Important
Local performance testing is a great way to understand if your website performance is trending in the right direction, but it won’t paint a full picture of reality. The World Wide Web is the Wild West, and collectively, we’ve almost certainly lost track of the variety of device types, internet connection speeds, screen sizes, browsers, and browser versions that people are using to access websites — all of which can have an impact on page performance and user experience.
Field data — and lots of it — collected by an application performance monitoring tool like Sentry from real people using your website on their devices will give you a far more accurate report of your website performance than your lab data collected from a small sample size using a high-spec, super-powered dev machine under a set of controlled conditions. Philip Walton reported in 2021 that “almost half of all pages that scored 100 on Lighthouse didn’t meet the recommended Core Web Vitals thresholds” based on data from the HTTP Archive.
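Collecting that field data does not have to be complicated. As a rough illustration of the principle (not Sentry’s actual setup), here is a minimal sketch that reports Core Web Vitals from real visitors’ browsers using the open-source web-vitals library; the /analytics endpoint is a hypothetical placeholder:

```js
import { onCLS, onINP, onLCP } from 'web-vitals';

// Send each metric to a (hypothetical) collection endpoint.
// sendBeacon survives page unloads, with fetch + keepalive as a fallback.
function sendToAnalytics(metric) {
  const body = JSON.stringify({
    name: metric.name,   // e.g. 'LCP', 'CLS', 'INP'
    value: metric.value, // the measured value for this page load
    id: metric.id,       // unique ID for deduplication
  });

  if (navigator.sendBeacon) {
    navigator.sendBeacon('/analytics', body);
  } else {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

// Each callback fires when its metric is ready to be reported.
onCLS(sendToAnalytics);
onINP(sendToAnalytics);
onLCP(sendToAnalytics);
```

An APM tool like Sentry wires this kind of reporting up for you and aggregates the results across all of your real users.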
Web performance is more than a single Core Web Vitals metric or Lighthouse performance score. What we’re talking about goes way beyond the type of raw data we’re working with.

Web Performance Is More Than Numbers

Speed is often the first thing that comes up when talking about web performance — just how long does a page take to load? This isn’t the worst thing to measure, but we must bear in mind that speed is probably influenced heavily by business KPIs and sales targets. Google released a report in 2018 suggesting that the probability of a bounce increases by 32% as page load time goes from one second to three seconds, and soars to 123% as it goes from one second to ten seconds. So, we must conclude that converting more sales requires reducing bounce rates. And to reduce bounce rates, we must make our pages load faster.

But what does “load faster” even mean? At some point, we’re physically incapable of making a web page load any faster. Humans — and the servers that connect them — are spread around the globe, and modern internet infrastructure can only deliver so many bytes at a time.
The bottom line is that page load is not a single moment in time. In an article titled “What is speed?” Google explains that a page load event is:

[…] “an experience that no single metric can fully capture. There are multiple moments during the load experience that can affect whether a user perceives it as ‘fast’, and if you just focus solely on one, you might miss bad experiences that happen during the rest of the time.”
The key word here is experience. Real web performance is less about numbers and speed than it is about how we experience page load and page usability as users. And this segues nicely into a discussion of how Google Lighthouse calculates performance scores. (It’s much less about pure speed than you might think.)

How Google Lighthouse Performance Scores Are Calculated
The Google Lighthouse performance score is calculated using a weighted combination of scores based on Core Web Vitals metrics (i.e., Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS)) and other speed-related metrics (i.e., First Contentful Paint (FCP), Speed Index (SI), and Total Blocking Time (TBT)), all of which are observable throughout the page load timeline.
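Each metric is first converted to its own score (Lighthouse maps the raw measurement onto a log-normal scoring curve), and the overall performance score is a weighted average of those individual scores. As a rough, illustrative sketch only (assuming Lighthouse v10’s weights; the per-metric scores below are made up):

```js
// Lighthouse v10 weighting of the five performance metrics (assumed here).
const weights = { FCP: 0.10, SI: 0.10, LCP: 0.25, TBT: 0.30, CLS: 0.25 };

// Hypothetical per-metric scores (0–1), as Lighthouse would derive them
// from the raw measurements via its log-normal scoring curves.
const metricScores = { FCP: 0.98, SI: 0.95, LCP: 0.90, TBT: 0.85, CLS: 1.00 };

// The overall performance score is the weighted average of the metric scores.
const performanceScore = Object.entries(weights).reduce(
  (total, [metric, weight]) => total + weight * metricScores[metric],
  0
);

console.log(Math.round(performanceScore * 100)); // → 92 for these example values
```

This is how the metrics are weighted in the overall score: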