Q1 2014 Site Performance Report

May flowers are blooming, and we’re bringing you the Q1 2014 Site Performance Report. There are two significant changes in this report: the synthetic numbers now come from Catchpoint instead of WebPagetest, and going forward we will label our reports by quarter instead of by month.

The backend numbers in this report follow the trend from December 2013: load times are slightly up across the board. The front-end numbers are slightly up as well, primarily due to experiments and redesigns. Let’s dive into the data!

Server Side Performance

Here are the median and 95th percentile load times for signed in users on our core pages on Wednesday, April 23rd:

[Chart: median and 95th percentile server-side load times for our core pages]

There was a small increase in both median and 95th percentile load times across the board over the last three months, with a larger jump on the homepage. We are currently running a few experiments on the homepage, one of which is significantly slower than the other variants and is pulling up the 95th percentile. While we understand that this may skew test results, we want preliminary results from the experiment before we spend engineering effort optimizing this variant.

As for the small increases everywhere else, this has been the pattern over the last six months, and is largely due to new features adding a few milliseconds here and there, increased usage from other countries (translating the site has a performance cost), and overall added load on our infrastructure. We expect load times to creep up for a while, followed by a significant dip as we upgrade or revamp the pieces of our infrastructure that are suffering. As long as the increases aren’t massive, this is a healthy oscillation, and it makes efficient use of engineering time.
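To make the percentile discussion concrete, here is a minimal sketch of how median and 95th percentile load times can be computed from a batch of timing samples. The nearest-rank method and the sample numbers are our own illustration, not how our actual metrics pipeline works:

```typescript
// Minimal sketch: median and 95th percentile via the nearest-rank
// method. Not our real pipeline; the samples below are made up.
function percentile(samples: number[], p: number): number {
  // Sort a copy so the caller's array is left untouched.
  const sorted = [...samples].sort((a, b) => a - b);
  // Nearest rank: the smallest value with at least p% of samples
  // at or below it.
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}

// Hypothetical load times in milliseconds for one page, one day.
const loadTimesMs = [212, 233, 264, 276, 287, 305, 340, 980, 1150, 198];
console.log(`median: ${percentile(loadTimesMs, 50)} ms`); // 276 ms
console.log(`p95:    ${percentile(loadTimesMs, 95)} ms`); // 1150 ms
```

Note how a handful of slow samples from one experiment variant can move the 95th percentile substantially while barely touching the median, which is exactly the pattern we see on the homepage.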

Synthetic Front-end Performance

Because of some implementation details of our private WebPagetest instance, the data we have for Q1 isn’t consistent and clean enough to provide a true comparison between the last report and this one. The good news is that we also use Catchpoint to collect synthetic data, and that data goes back to well before the last report. This enabled us to pull data from mid-December and compare it to data from April, from the same days on which we pulled the server-side and RUM data.

Our Catchpoint tests run in IE9 only, from New York, London, Chicago, Seattle, and Miami, every two hours. The "Webpage Response" metric is the time from the initial request to the last byte of the final element on the page. Here is that data:

[Chart: synthetic Start Render and Webpage Response times from Catchpoint]

The increase on the homepage is somewhat expected, given the experiments we are running and the increase in backend time. The search page saw a large increase in both Start Render and Webpage Response, but we are currently testing a completely revamped search results page, so this is also expected. The listing page had a modest jump in Start Render as well, again due to differences between the experiments running in December and those running in April.

Real User Front-end Performance

As always, these numbers come from mPulse, and are measured via JavaScript running in real users’ browsers:

[Chart: real user page load times from mPulse]

No big surprises here: we see the same bump on the homepage and the search results page that we saw in the server-side and synthetic numbers. Everything else is essentially neutral, which isn’t particularly exciting. In future reports we may break this data out by region or by mobile vs. desktop, or provide percentiles beyond the median (the 50th percentile).
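For readers curious what "measured via JavaScript running in real users’ browsers" looks like in practice, here is a minimal sketch of the kind of timing a RUM beacon can collect with the Navigation Timing API (available in IE9 and later). mPulse’s actual beacon is far more sophisticated, and the field names below are purely illustrative:

```typescript
// Minimal sketch of in-browser RUM measurement using the Navigation
// Timing API. Not mPulse's implementation; field names are our own.
window.addEventListener('load', () => {
  // loadEventEnd is only set after the load handler returns,
  // so read the timings on the next tick.
  setTimeout(() => {
    const t = performance.timing;
    const beacon = {
      ttfb: t.responseStart - t.navigationStart,    // time to first byte
      domReady: t.domContentLoadedEventEnd - t.navigationStart,
      pageLoad: t.loadEventEnd - t.navigationStart, // full page load
    };
    // A real RUM library would POST this to a collection endpoint;
    // here we just log it.
    console.log(beacon);
  }, 0);
});
```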

Conclusion

We are definitely in a stage of increasing backend load time and front-end volatility due to experiments and general feature development. The performance team has been spending the past few months focusing on some internal tools that we hope to open source soon, as well as running a number of experiments ourselves to try to find some large perf wins. You will be hearing more about these efforts in the coming months, and hopefully some of them will influence future performance reports!

Finally, our whole team will be at Velocity Santa Clara this coming June, and Lara, Seth, and I are all giving talks.  Feel free to stop us in the hallways and say hi!