Synthetic performance testing
Synthetic performance testing means using a browser on a server somewhere in the world to load a web page and collect performance metrics. Together with real user monitoring (collecting performance metrics from real users), it is one of the two ways we collect performance metrics for Wikipedia.
The reason we use synthetic performance testing is to find performance regressions in our code. We also use the data we collect as a wayback machine, letting us go back in time and see what kind of content we were serving.
To get metrics stable enough to find regressions, we need a stable environment for our tests:
- Run our tools on a stable server. At the moment we run them on AWS; so far those servers have had the least impact on our metrics.
- The tests need stable connectivity: the connection must behave the same every run so that it does not affect our metrics. We use tc to get that.
- We need a stable browser. We keep track of browser updates and make sure the browser behaves the same all the time.
- The pages we test should have stable performance. Depending on how a page is built, its performance can be more or less stable.
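To illustrate the connectivity bullet above, here is a minimal sketch of how a fixed network profile can be applied with tc (Linux traffic control). The interface name, latency, and bandwidth values are illustrative assumptions, not our actual production profile; the commands are built as strings so they can be inspected before running them with root privileges.

```python
# Sketch: shape the test agent's network connection with tc so every
# run sees the same latency and bandwidth. All numbers are assumptions.

def tc_commands(interface="eth0", rtt_ms=100, rate_mbit=5):
    """Build the tc invocations for a fixed latency/bandwidth profile."""
    return [
        # Remove any previous shaping rules (fails harmlessly on first run).
        f"tc qdisc del dev {interface} root",
        # Add a constant delay with netem...
        f"tc qdisc add dev {interface} root handle 1: netem delay {rtt_ms}ms",
        # ...and cap bandwidth with tbf chained under netem.
        f"tc qdisc add dev {interface} parent 1: handle 2: tbf "
        f"rate {rate_mbit}mbit burst 32kbit latency 400ms",
    ]

if __name__ == "__main__":
    for cmd in tc_commands():
        print(cmd)
```

Running the same profile before every test means a change in the measured metrics points at the page or the browser, not at the network.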
We use two different tools to measure the performance of Wikipedia:
- We use WebPageTest to measure the full performance journey from the browser to the server and back.
- We use WebPageReplay to focus on front-end performance. WebPageReplay is a traffic replay proxy that we use to get rid of server and internet flakiness.
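As a rough sketch of how a test run can be submitted programmatically, WebPageTest exposes an HTTP API where a GET request to runtest.php starts a test. The server, location label, run count, and API key below are placeholders, not our actual configuration:

```python
from urllib.parse import urlencode

def runtest_url(page, api_key, server="https://www.webpagetest.org",
                location="Dulles:Chrome", runs=3):
    """Build a WebPageTest runtest.php request URL.

    The location and run count are illustrative; real values depend on
    how the WebPageTest server and its agents are configured.
    """
    params = {
        "url": page,          # page to test
        "k": api_key,         # API key (placeholder)
        "location": location, # which agent/browser to use
        "runs": runs,         # repeat runs to reduce noise
        "f": "json",          # ask for a JSON response
    }
    return f"{server}/runtest.php?{urlencode(params)}"

if __name__ == "__main__":
    print(runtest_url("https://en.wikipedia.org/wiki/Sweden", "YOUR_KEY"))
```

The JSON response contains URLs for polling the test status and fetching results once the runs finish.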
The tests for both tools are configured in https://gerrit.wikimedia.org/r/#/admin/projects/performance/synthetic-monitoring-tests
We've been using our own WebPageTest setup since 2015. You can read about our setup:
- We have our own WebPageTest server and agent; you can access it at http://wpt.wmftest.org
- If you want to try out a page, you can use the public WebPageTest server https://www.webpagetest.org that is hosted by Patrick Meenan.
- We collect metrics and graph them in Grafana.
- Alerts using WebPageTest
We've been using WebPageReplay since 2017, and you can read about our setup.
CrUX/Google PageSpeed Insights
We also collect some overall metrics from the Chrome User Experience Report to keep track of how Wikipedia is doing from Chrome's point of view. On gpsi.webperf.eqiad.wmflabs we run a couple of tests daily and record whether we are rated slow/moderate/fast. You can see those metrics on the performance summary dashboard.
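The slow/moderate/fast rating can be sketched as a simple bucketing of a CrUX metric such as first contentful paint. The thresholds below are illustrative assumptions for the example, not the exact cut-offs Google uses:

```python
# Sketch: classify a first-contentful-paint time (in milliseconds)
# into a fast/moderate/slow bucket. Thresholds are assumptions.

def bucket_fcp(fcp_ms):
    """Return a speed bucket for a first contentful paint time."""
    if fcp_ms <= 1000:
        return "fast"
    if fcp_ms <= 3000:
        return "moderate"
    return "slow"

if __name__ == "__main__":
    for sample_ms in (800, 2500, 4200):
        print(sample_ms, bucket_fcp(sample_ms))
```

Tracking the bucket daily rather than the raw number makes it easy to see at a glance when a page drops out of the fast category.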