From a3956d77d191579cf2f543eb8c149761cdd0f496 Mon Sep 17 00:00:00 2001
From: soulgalore
Date: Thu, 21 Dec 2023 08:30:21 +0100
Subject: [PATCH] docs: pagespeed vs sitespeed.io

---
 .../sitespeed.io/best-practice/index.md | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)

diff --git a/docs/documentation/sitespeed.io/best-practice/index.md b/docs/documentation/sitespeed.io/best-practice/index.md
index 43102d359..c2f64c0b2 100644
--- a/docs/documentation/sitespeed.io/best-practice/index.md
+++ b/docs/documentation/sitespeed.io/best-practice/index.md
@@ -213,3 +213,22 @@ And a couple of generic things that will make your metrics differ:
 * **Connectivity matters** - You need to set the connectivity.
 * **CPU matters** - Running the same tests with the same tool on different machines will give different results.
 * **Your page matters** - It could happen that your page has different sweat spots on connectivity (that makes the page render faster) so even a small change, will make the page much slower (we have that scenario on Wikipedia).
+
+
+## Difference in metrics between sitespeed.io and https://pagespeed.web.dev
+
+When analyzing web performance data, it's important to understand the source and context of the metrics. The data in the Chrome User Experience Report consists of metrics collected by Chrome from users who *consented* to share their browsing data. The report reflects the 75th percentile of user experiences, meaning that for a given metric, 75% of the sampled users experienced that performance level or better. For instance, in the example below, 75% of users had a Largest Contentful Paint (LCP) faster than 1.4 seconds, across various devices and network conditions.
+
+![webdev](https://github.com/sitespeedio/sitespeed.io/assets/540757/aacf206a-24d4-4537-8d57-ebf055d58774)
+{: .img-thumbnail}
+
+Is this data useful? Absolutely, especially if you don't have your own real user monitoring (RUM) metrics. However, it's important to note that this data is limited to Chrome users who agreed to data collection, potentially skewing the metrics if your audience uses a broader range of browsers, such as Safari, Edge, or Firefox.
+
+To optimize your sitespeed.io tests, use these insights to mirror the experience of the 75th percentile of your user base. For example, you can adjust network throttling in sitespeed.io to match the Time to First Byte (TTFB) reported in the Chrome data, and then compare the First Contentful Paint (FCP) and LCP metrics. If they don't align, consider adjusting CPU throttling, or better yet, test on actual mobile devices. More information on CPU benchmarking for testing (as done for Wikipedia) can be found in the [CPU benchmark documentation](https://www.sitespeed.io/documentation/sitespeed.io/cpu-benchmark/).
+
+sitespeed.io even offers a [Chrome User Experience Report plugin](https://www.sitespeed.io/documentation/sitespeed.io/crux/) that lets you pull this data directly from Google and compare it with your sitespeed.io results.
+
+In summary, consider this approach:
+
+1. If you have your own RUM metrics, use them to calibrate your sitespeed.io tests.
+2. If not, leverage the Chrome User Experience data, keeping in mind its potential limitations, to guide your testing and optimization efforts.
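A note on the p75 interpretation above, for reviewers: the CrUX number is a percentile over a distribution of real page views, not a single measurement, which is why one lab run rarely matches it directly. A minimal sketch of the idea, using the nearest-rank percentile definition and made-up LCP samples (CrUX's actual aggregation is more involved):

```python
import math

def p75(samples):
    """Nearest-rank 75th percentile: the value that 75% of samples
    fall at or below (what a CrUX-style p75 summarizes)."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-based rank of the p75 sample
    return ordered[rank - 1]

# Made-up LCP samples in milliseconds from eight page views
lcp_ms = [900, 1100, 1200, 1250, 1300, 1350, 1400, 2200]
print(p75(lcp_ms))  # 1350: 75% of these views had an LCP of 1350 ms or faster
```

This is also why calibration (steps 1 and 2 above) makes sense: tune connectivity and CPU throttling until a lab run lands near the p75 of your real-user distribution, rather than expecting the two numbers to agree out of the box.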