Speech Summary
Tools
- Chrome User Experience Report (CrUX) offers billions of real-user performance metrics for free (a query sketch follows this list).
- Treo is a graphical tool that visualizes CrUX data for any domain.
- DevTools: widely known but given less focus in this talk; video courses are available for deeper study.
- WebPageTest: a powerful, free tool that wraps DevTools for detailed performance tests, though its results can be complex and intimidating for clients.
- Use these tools to obtain reliable, realistic, and repeatable performance data.
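The field data behind Treo can also be pulled directly. Below is a minimal sketch of querying the public CrUX API for an origin's p75 values; the origin and API key are placeholders, and the form factor and metric list should be adjusted to what you actually care about.

```ts
// Minimal sketch: pull p75 field data from the CrUX API for an origin.
// The origin and CRUX_API_KEY are placeholders you would supply yourself.
const CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord";

async function fetchCruxP75(origin: string, apiKey: string) {
  const res = await fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      origin,              // or { url: "https://example.com/product" } for page-level data
      formFactor: "PHONE", // match the device split you care about
      metrics: [
        "largest_contentful_paint",
        "interaction_to_next_paint",
        "cumulative_layout_shift",
      ],
    }),
  });
  if (!res.ok) throw new Error(`CrUX query failed: ${res.status}`);
  const { record } = await res.json();
  // Each metric exposes a p75 value alongside its good / needs-improvement / poor histogram.
  for (const [name, data] of Object.entries<any>(record.metrics)) {
    console.log(name, "p75:", data.percentiles.p75);
  }
}

fetchCruxP75("https://example.com", process.env.CRUX_API_KEY ?? "");
```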
Key Metrics & Benchmarking
- Agree on what "fast" means before starting any project.
- Recommended starting point: Core Web Vitals (Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift).
- Enabler metrics such as Time to First Byte (TTFB) help developers diagnose issues but are not direct measures of user experience.
- For modern JS frameworks, DOMContentLoaded event timing is a useful indicator of when the app becomes usable.
- Always consider percentile metrics like P75 or P95 to represent the user experience realistically.
- Aim for P95 rather than P75 to ensure a good experience for nearly all users (see the percentile sketch after this list).
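A rough illustration of why percentiles beat averages: a few very slow sessions drag the mean even when most users are fine. This sketch uses the nearest-rank method, and the LCP samples are invented for illustration.

```ts
// Minimal sketch: P75 / P95 from a set of lab or RUM samples (values are made up).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1; // nearest-rank method
  return sorted[Math.max(0, index)];
}

const lcpSamplesMs = [1800, 2100, 2300, 2500, 2900, 3400, 4100, 5200, 6000, 9500];
console.log("P75:", percentile(lcpSamplesMs, 75), "ms"); // at least 75% of sessions are at or below this value
console.log("P95:", percentile(lcpSamplesMs, 95), "ms"); // covers nearly all users, including the slow tail
```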
Setting Up Tests
- Four main variables in tests: URLs, device type, connection speed, and geographic locale (a test-matrix sketch follows this list).
- Select URLs based on commercial significance; Search Console can help identify slow pages.
- Base device-type choices on your audience split (mobile vs. desktop).
- Set connection speeds realistically by comparing test results with CrUX data; avoid overly pessimistic defaults like 3G emulations in developed regions.
- Geographic location should align with your user base for meaningful results.
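A minimal sketch of encoding those four variables as an explicit test matrix, so every run is deliberate and repeatable. The URLs, connection names, and locations below are illustrative placeholders rather than canonical WebPageTest identifiers; fill them in from your CrUX and Search Console findings.

```ts
// Minimal sketch: one row per test, covering URL, device, connection, and location.
// All values here are illustrative assumptions.
interface TestConfig {
  url: string;        // commercially significant pages
  device: "mobile" | "desktop";
  connection: string; // throttling profile name, e.g. a 4G/LTE-like profile
  location: string;   // test agent region close to your real users
}

const testMatrix: TestConfig[] = [
  { url: "https://example.com/",          device: "mobile",  connection: "4G",    location: "eu-west" },
  { url: "https://example.com/product/1", device: "mobile",  connection: "4G",    location: "eu-west" },
  { url: "https://example.com/checkout",  device: "desktop", connection: "Cable", location: "eu-west" },
];
```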
Testing Considerations
- Avoid relying solely on cold start scenarios (empty cache, no cookies, first visit) as they are pessimistic and not reflective of typical user behavior.
- Design tests to mimic realistic user journeys, including repeated visits and interactions.
- Use WebPageTest scripting to automate user-flow scenarios, such as accepting cookies so the banner does not reappear on every run (a scripting sketch follows this list).
- Populate shopping carts or set local storage values before testing key flows like checkout.
- For SPAs, simulate soft navigation by scripting clicks instead of hard URL loads to get accurate timings.
- Use the DevTools Application panel to inspect cookies, local and session storage to pass realistic context into tests.
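A sketch of how such a flow might be scripted and submitted through WebPageTest's HTTP API: an unmeasured first step accepts the cookie banner and seeds local storage, then the measured step loads the page under test. The URLs, selectors, storage keys, API key, and location string are placeholders; confirm the script commands and API parameters against the WebPageTest documentation for your instance.

```ts
// Minimal sketch: submit a scripted WebPageTest run via its HTTP API.
// The script performs an unmeasured setup navigation, then records the page we care about.
const wptScript = [
  "logData\t0",                                                                  // don't record the setup steps
  "navigate\thttps://example.com/",
  "execAndWait\tdocument.querySelector('#accept-cookies').click()",              // dismiss the banner once
  "execAndWait\tlocalStorage.setItem('cart', JSON.stringify([{sku:'ABC-1',qty:1}]))", // seed a cart
  "logData\t1",                                                                  // start recording
  "navigate\thttps://example.com/checkout",                                      // the measured step
].join("\n");

const params = new URLSearchParams({
  k: process.env.WPT_API_KEY ?? "",  // placeholder API key
  script: wptScript,
  location: "London_EC2:Chrome.4G",  // assumed location/profile string; verify against your instance
  f: "json",
});

fetch("https://www.webpagetest.org/runtest.php", { method: "POST", body: params })
  .then((res) => res.json())
  .then((data) => console.log(data));
```

For a SPA, the final navigate step would typically be replaced with a clickAndWait on the in-app link, so the timing reflects the client-side route change rather than a hard reload.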
Throttling Insights
- DevTools throttling is simulated inside the browser rather than applied to the actual network, which can produce misleading results.
- Use tools like Apple's Network Link Conditioner for real throttling that mimics network conditions accurately.
- Synchronize throttling profiles across WebPageTest, DevTools, and local testing (a shared-profile sketch follows this list).
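One way to keep profiles consistent is to define the numbers once and derive each tool's settings from them. The values below sketch a 4G-like profile and are assumptions to calibrate against your own CrUX comparison, not canonical figures.

```ts
// Minimal sketch: one shared throttling profile reused everywhere, so a "4G" run
// means the same thing in WebPageTest, DevTools, and Network Link Conditioner.
// The numbers are illustrative assumptions; calibrate them against CrUX data.
const profile4G = {
  downloadKbps: 9000,
  uploadKbps: 9000,
  latencyMs: 170,
  packetLossPct: 0,
};

// Example: the same profile expressed for the Chrome DevTools Protocol
// Network.emulateNetworkConditions command (throughput in bytes/sec, latency in ms).
const devtoolsConditions = {
  offline: false,
  downloadThroughput: (profile4G.downloadKbps * 1000) / 8,
  uploadThroughput: (profile4G.uploadKbps * 1000) / 8,
  latency: profile4G.latencyMs,
};
```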
Actionable Items
- Use CrUX and Treo to gather real user data for test design.
- Define key metrics and agree on performance goals (preferably targeting P95).
- Select test URLs by commercial importance and validate with Search Console data.
- Choose device types and geographic locations fitting your audience.
- Adjust test connection speeds to match real user conditions (start with 4G/LTE).
- Script WebPageTest to mimic realistic user interactions, including cookie acceptance and shopping cart population.
- For SPAs, script navigation via clicks to reflect true client-side routing.
- Avoid over-reliance on cold start testing; incorporate repeat and interactive scenarios.
- Employ real network throttling tools like Network Link Conditioner instead of DevTools throttling.
- Standardize throttling settings across all testing environments for reliable repeatability.
Summary
- Performance tests must be realistic, repeatable, and reliable.
- Mimicking real user behavior produces more trustworthy and actionable data.
- Proper tooling and test setup help avoid misleading slow results and improve confidence in optimizations.
- Consistent methodologies help communicate performance clearly to clients and teams.
How to Think Like a Performance Engineer
15:40 - 16:10, 27th of May (Tuesday) 2025 / DEV TRENDS STAGE
As awareness and tooling around site speed have been improving at a very exciting rate, has performance testing actually become any easier? Any more straightforward? As someone who spends every day auditing client projects, I think areas of confusion have actually increased in many places. Which tools should we be using? Can we trust them? How do we run tests that serve as realistic and actionable predictors? And how do we know when we’ve won?
In this talk, we’ll look at highly practical tools and workflows to ensure that every test we run has a purpose and gives us data we can truly leverage. By the end, we will all have a shared idea of what effective performance testing looks like, as well as customised and fine-tuned tooling to ensure replicable and predictable tests.