The holiday shopping season is a critical time for retailers. This year has already been eventful, with outages reported at Neiman Marcus, Target, Time Warner Cable, Sony PlayStation, PayPal, and Royal Bank of Scotland.
As a Software-as-a-Service company ourselves, we appreciate the challenges of keeping things running 24/7. So this year we thought we’d provide some insight into the Black Friday and Cyber Monday performance of two leading e-commerce outfits: Google and Apple.
What we looked at
We wrote scripts that simulated buying a phone from each company's online store, stopping just short of the final purchase, so we won't have thousands of devices to return. The scripts ran once a minute from three different locations in data centers near Portland, San Francisco, and Washington, D.C., executing more than 7,000 times from Black Friday through Cyber Monday (November 27-30). For reference, we've posted the scripts on GitHub.
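The actual scripts (which drive a real browser) are on GitHub; the once-a-minute measurement loop behind them can be sketched in a few lines of Python. The `fetch` callable here is a stand-in for whatever actually loads the page:

```python
import time

def run_checks(fetch, urls, interval=60, iterations=1):
    """Time each check once per interval, collecting (url, elapsed_seconds).

    `fetch` is any callable that loads a URL and blocks until the page
    is done; in production it would drive a real browser.
    """
    samples = []
    for i in range(iterations):
        for url in urls:
            start = time.monotonic()
            fetch(url)
            samples.append((url, time.monotonic() - start))
        if i < iterations - 1:
            time.sleep(interval)  # wait for the next one-minute tick
    return samples
```

Running this from several geographically separate data centers, as we did, separates site slowness from network effects near any one location.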
Our data suggests that even these Web giants haven't fully optimized the buying process during one of the year's biggest shopping periods. At this scale, even small increases in page load times could cost them millions of dollars in lost sales. We also believe that looking only at high-level response times isn't enough to understand how to optimize Web performance; you need to drill into the data to determine which improvements matter most.
The scripts yielded a tremendous amount of data: for every Web asset of every page, they captured the following metrics:
- Overall load time for the URL and all dependent URLs
- Time to load and parse the initial HTML document
- Breakdown of load-time activities, including connection, SSL negotiation, DNS resolution, request send, block time, wait time, and receive time
- Request body and header sizes
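These per-request components correspond closely to the fields of the standard HAR (HTTP Archive) `timings` object that browser-based monitors emit (`blocked`, `dns`, `connect`, `ssl`, `send`, `wait`, `receive`). A small sketch of totaling them, assuming HAR-style input with hypothetical values:

```python
def total_request_time(timings):
    """Sum the phases of one HAR request's `timings` object, in ms.

    Per the HAR 1.2 spec, `ssl` time is already included in `connect`,
    so it is not added again; a value of -1 means the phase did not
    apply (e.g. no DNS lookup on a reused connection).
    """
    phases = ("blocked", "dns", "connect", "send", "wait", "receive")
    return sum(t for p in phases if (t := timings.get(p, -1)) >= 0)

# Hypothetical timings (milliseconds) for one asset:
sample = {"blocked": 5, "dns": 12, "connect": 20, "ssl": 15,
          "send": 1, "wait": 40, "receive": 8}
print(total_request_time(sample))  # → 86
```

Summing per-phase times like this is what lets you say, for example, that a slowdown came from SSL negotiation rather than server wait time.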
Overall speediness: Google wins
The first thing we noticed was how Google was consistently fast during the entire period. Its average response time on the phone purchase script was around five seconds:
Apple, in comparison, had average response times more than twice as long, and displayed some noticeable upticks in load times on Black Friday and Cyber Monday:
Not surprisingly, both sites had some outliers with longer response times. But Google's worst case, 34 seconds, was far less extreme than Apple's, which topped out at over a minute.
The elements of speed
Digging into the network timing components surfaced some interesting trends. Overall network timings for Google averaged less than 60 ms, including only a few milliseconds of SSL negotiation. Even while some other sites were buckling under the strain of Black Friday, Google actually got faster, dropping its average Block time from under 20 ms to under 10 ms.
Apple, in contrast, had average network timings of around 80 ms, but had faster SSL negotiation times. We also saw some spikiness in Apple’s Receive times on Black Friday and Cyber Monday, although other timing elements remained steady.
When we look at resource load time, Google came out ahead again, even though it spent longer downloading CSS and also downloaded web fonts, something Apple doesn't do at all.
(Third) party time!
One surprise is how much faster Apple was in terms of time spent on third parties. Apple had a total average third-party time of 200 ms, split across just three domains: apple.com, cdn-apple.com, and optimizely.com. Interestingly, Optimizely consumed about a third of that time (70 ms), illustrating the performance cost of Optimizely's A/B testing and personalization services.
Google, in contrast, spent roughly four times as much time accessing third-party URLs, spread across nine different domains. The biggest chunk of that time, around 150 ms, was consumed by fonts.googleapis.com. That raises the question: since numerous studies show that users are more likely to buy from faster websites, why not use standard fonts? There could be a real cost to Google from this design decision, as we detail below.
So, why is Google faster than Apple? New Relic Synthetics' results timeline provides some insight. Looking at two long-running Cyber Monday results, one for each company, we see three things that Google is doing well:
- Less data. Google is transferring about 75% less data to the browser (3 MB versus 12 MB for Apple).
- Fewer requests. Google is making roughly one third as many requests (114 versus 309 for Apple).
- Fewer pages. Google built its purchase process as a single-page app. Apple requires four independent page loads, each of which takes additional time. Both companies’ page loads are highlighted in red below:
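You can check your own site against the first two of these points by exporting a HAR file from your browser's dev tools. A minimal sketch, using the HAR spec's `response.bodySize` field (where -1 means the size is unknown); the sample entries are made up:

```python
def page_weight(entries):
    """Return (request_count, total_body_bytes) for a list of HAR entries.

    Entries with bodySize == -1 (size unknown, e.g. served from cache)
    contribute zero bytes but still count as requests.
    """
    total = sum(max(e["response"].get("bodySize", 0), 0) for e in entries)
    return len(entries), total

# Hypothetical HAR entries:
entries = [
    {"response": {"bodySize": 150_000}},
    {"response": {"bodySize": 90_000}},
    {"response": {"bodySize": -1}},   # cached, size unknown
]
print(page_weight(entries))  # → (3, 240000)
```

Comparing these two numbers before and after an optimization pass is a quick way to verify you're actually shipping fewer bytes and fewer requests.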
Both Apple and Google enjoyed solid performance from Black Friday through Cyber Monday, with no major outages, but Google delivered faster overall performance. Performance-oriented Web teams should consider adopting best practices from both companies, namely:
- Stick to single-page applications to reduce total page load time.
- Use lighter-weight pages, both in terms of bytes transferred and requests made.
- Avoid A/B testing during major sales periods; save your A/B testing for days with lower sales volume.
- Avoid custom web fonts, since downloading them adds to page load times.
Throughout all this optimization, it's important to remember the end goal: improving business outcomes. In particular, every additional 100 ms of page load time reduces sales conversions by 1%, according to an Amazon study. This argues for aggressive performance optimization across Web applications. For example, merely eliminating the 70 ms of A/B testing overhead could have improved Apple's iPhone 6s sales conversions by nearly 1%. The same math applies to Google's 150 ms of font downloads.
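Under that Amazon rule of thumb (1% of sales per 100 ms), the back-of-the-envelope math looks like this; the daily revenue figure is purely illustrative:

```python
def lost_sales(extra_latency_ms, daily_revenue):
    """Estimated revenue lost to added latency, assuming a linear
    1% conversion drop per 100 ms (the Amazon rule of thumb)."""
    # 1% per 100 ms == 1/10_000 of revenue per millisecond
    return daily_revenue * extra_latency_ms / 10_000

# 70 ms of A/B-testing overhead on a hypothetical $10M sales day:
print(lost_sales(70, 10_000_000))   # → 70000.0
# 150 ms of third-party font downloads on the same day:
print(lost_sales(150, 10_000_000))  # → 150000.0
```

The linearity is of course an approximation, but it makes the stakes of a 70 ms or 150 ms design decision concrete.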
We welcome software-powered businesses to try out New Relic Synthetics. It runs in our Software Analytics Cloud, which provides scalability to deliver the detailed metrics required to optimize conversion rates and enable your website to deliver on its business potential.
Neha Duggal contributed to this post.