This is a guest post written by Yusuf Bhana, Digital Marketing Manager at global translation agency TranslateMedia. In addition to developing their own translation management technology, TranslateMedia was ranked in the top 10 fastest growing digital media and technology companies in Europe in 2010.
Since the advent of the World Wide Web in the early 1990s, technology has advanced rapidly. Available bandwidth has increased enormously, and the devices connected to the internet continue to improve in speed and performance. This pace of development promised to usher in an era of lightning-fast internet. Yet the web still feels slow today. Why is that?
Many have argued that although global internet connection speeds have undoubtedly improved, the volume of data transmitted across the internet has increased substantially as audiences demand greater interactivity and richer content.
The text-based online experiences of the 1990s have been replaced by interactive image galleries, video and slick user interfaces that rely on lots of images and libraries of scripts to allow them to function across all browsers and devices.
As a result, web pages have become a lot bigger. Webperformancetoday.com revealed that since 2010 the average web page has almost doubled in size, from just over 600 KB to 1.2 MB, driven largely by an increase in images and scripts.
While web pages are increasing in size, average connection speeds globally are failing to keep up. Akamai's State of the Internet report showed that average global connection speeds increased by just 13% between 2012 and 2013, and actually declined by 5% between Q2 and Q3 of 2013.
This means that one can’t rely on technology outpacing the needs of the customer in the near or even distant future. As a result, many organizations have started to invest in front-end optimization and content delivery networks to improve speeds and increase user engagement.
For large companies, the investment of tens (or hundreds) of thousands of dollars on hardware and software to improve performance may be cost-effective, but smaller organizations often have a hard time measuring the issues caused by performance and justifying the investment.
From a technical to a marketing problem
In the past, site performance was considered a technical problem to be solved by developers. More recently marketers have become more interested in page load times, and it’s easy to see why.
The Aberdeen Group found that a one-second delay in load time resulted in 11% fewer page views, a 16% decrease in customer satisfaction and a 7% decrease in conversion. Akamai conducted a study which revealed that 47% of customers expect to wait no longer than 2 seconds for a web page to load. So slow page loads may be killing your online business, whether you’re reliant on online sales or advertising revenue as your main source of income.
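Those percentages compound into real money quickly. As a rough sketch, here is what the Aberdeen finding of a 7% conversion drop per second of delay implies for a hypothetical site (the traffic, conversion rate and order value below are invented for illustration, not from the study):

```python
def revenue_impact(monthly_visitors, conversion_rate, avg_order_value,
                   delay_seconds):
    """Estimate monthly revenue lost to page-load delay, assuming the
    Aberdeen figure of a 7% conversion drop per second of delay.
    All input figures here are hypothetical."""
    CONVERSION_DROP_PER_SECOND = 0.07
    baseline = monthly_visitors * conversion_rate * avg_order_value
    degraded_rate = conversion_rate * (1 - CONVERSION_DROP_PER_SECOND) ** delay_seconds
    degraded = monthly_visitors * degraded_rate * avg_order_value
    return baseline - degraded

# 100k visitors/month, 2% conversion, $50 average order, pages 1 second slower:
loss = revenue_impact(100_000, 0.02, 50.0, 1)
print(f"Estimated revenue lost per month: ${loss:,.2f}")  # roughly $7,000
```

Even at modest traffic levels, a single second of delay is worth thousands per month under these assumptions, which is why marketers now care about load times.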
Slow-loading web pages may even be affecting your site’s ability to attract visitors. In 2010, Google announced that website speed would begin to have an impact on search rankings. Google also uses site speed to determine overall quality scores of paid search advertising. This means that slow sites may have to spend more on advertising than their speedier competitors to achieve the same reach.
I’m responsible for digital marketing for a global translation agency called TranslateMedia. Website performance monitoring has become a priority for the company as more clients are being acquired overseas from SEO and paid search advertising.
As a result, we’ve installed New Relic on our site to see whether it can help us identify and resolve performance issues.
Using New Relic to deliver actionable insights on page speed
There are many tools available online that allow site owners to measure the performance of their sites, but few provide the information required to resolve performance issues both on the server and within the application.
New Relic monitors many critical system metrics such as CPU usage, physical memory, network activity, processes, and disk I/O utilization and capacity.
Before measuring front-end performance it’s important to ensure that the server that hosts your site has the available resources to cope with the volume of requests that it receives. High CPU or memory usage can contribute to slow page load times and, in extreme cases, make your site unresponsive. These are often the easiest to resolve by upgrading hardware, increasing bandwidth or using data compression to minimize disk utilization and bandwidth usage.
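To see why data compression helps on the bandwidth side, here is a small illustration using Python's standard library (the payload is synthetic; real HTML compresses less dramatically than this repetitive example, but text content routinely shrinks by 60–80%):

```python
import gzip

# A deliberately repetitive HTML-like payload, purely for illustration.
html = ("<div class='row'><span>Lorem ipsum dolor sit amet</span></div>\n" * 500).encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"Original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.1%} of original size)")
```

The same principle applies whether the goal is reducing bandwidth per request or disk utilization on the server.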
Server monitoring showed that our server was not experiencing any resource issues, so performance was likely to be impacted by the application itself.
New Relic’s application monitoring allows you to look inside the code or CMS that runs your site to see where bottlenecks are being created that impact performance.
The report splits the application into three distinct components: PHP, Database and Web External. It showed that most of the response time was spent in the PHP application, with a small amount of latency in the database and slightly less time spent in external services (e.g. social media plugins, analytics APIs).
While application performance could certainly be improved, a response time of around 600ms was not considered to be the main issue. This helped us to realize that the performance issues were occurring between the web server and the user. Luckily, New Relic provided the tools to investigate this further.
Browser page load time
Perhaps the most useful tool within New Relic’s suite of applications is the browser page load time analysis. The tool uses RUM (Real User Monitoring) to measure the page load times experienced by actual users. The load time is split into Web Application, Network, DOM Processing and Page Rendering metrics, describing how long the page is taking to access, download and render.
The report showed that average page load times were around 8.5 seconds, well above the 2-second threshold that Akamai found customers were willing to tolerate. The time spent in the application and on the network was negligible, which meant any optimization efforts were better focused on improving DOM processing and page rendering.
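That decision comes down to simple arithmetic on the RUM breakdown: rank each component by its share of total load time and spend effort where the share is largest. A sketch with illustrative numbers (not our actual figures):

```python
# Hypothetical RUM breakdown in milliseconds; the values are illustrative,
# not TranslateMedia's actual measurements.
timings = {
    "web application": 400,
    "network": 300,
    "dom processing": 4800,
    "page rendering": 3000,
}

total = sum(timings.values())
shares = {name: ms / total for name, ms in timings.items()}

# Rank components by share of total load time to decide where to focus.
for name, share in sorted(shares.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:>16}: {share:.0%}")
```

With numbers like these, DOM processing and page rendering account for roughly 90% of load time, so back-end tuning would barely move the needle.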
Since the TranslateMedia web server is located in London and the speed issues are largely confined to the front-end, there was likely to be a huge difference in page load times in different countries around the world. New Relic provides an additional tool that allowed us to analyze this.
What was particularly concerning was that page load times were poor in all of TranslateMedia’s key markets: the UK, USA, France, Germany and China saw load times of 5, 8, 7, 11 and 19 seconds respectively.
Furthermore, New Relic provided the ability to determine average load times by region within a country. A quick review of the report showed slow load times of 8 to 10 seconds in our key regions in the US.
So … what to do?
The data provided by New Relic allowed us to assign resources to the areas of site performance that had the biggest issues; as a result we were able to devise a strategy for improving page speed, which included:
Utilize a CDN
Due to the fact that the site receives traffic from all around the world, it could really benefit from delivering static resources using a content delivery network. This would allow users in geographically distant key markets such as China and USA to retrieve images, CSS and JS files and other static content from servers nearer to their own location — drastically reducing load times.
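In practice this usually means serving static assets from a CDN hostname rather than the origin server. A minimal sketch, with `cdn.example.com` standing in for whatever hostname the CDN provider assigns:

```html
<!-- Before: every asset is fetched from the origin server in London -->
<img src="/images/hero.jpg" alt="Hero">
<script src="/js/app.js"></script>

<!-- After: static assets resolve to the CDN's nearest edge server -->
<img src="https://cdn.example.com/images/hero.jpg" alt="Hero">
<script src="https://cdn.example.com/js/app.js"></script>
```

The HTML itself is still served from the origin, but the heavy static payload is delivered from an edge location close to the visitor.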
Enable gzip compression

By enabling gzip compression on the server, responses will contain smaller amounts of data, which should substantially improve delivery speed.
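On nginx, for example, this takes only a few directives (a sketch; the MIME-type list should be tailored to the site's assets, and Apache offers the equivalent via mod_deflate):

```nginx
# Compress text-based responses before sending them over the wire.
gzip            on;
gzip_comp_level 5;            # balance CPU cost against compression ratio
gzip_min_length 256;          # skip tiny responses not worth compressing
gzip_types      text/css application/javascript application/json
                image/svg+xml text/plain;
# text/html is always compressed once gzip is on
```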
Compress images

By using lossless compression on images, the weight of pages should be drastically reduced and speeds improved.
Reduce the number of HTTP requests

Combining scripts and stylesheets into fewer files and using CSS sprites for small icons means the browser has fewer round trips to make before it can render the page.
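A build step can concatenate stylesheets (or scripts) into a single file so the browser makes one request instead of several. A minimal sketch with hypothetical file names:

```python
from pathlib import Path

# Hypothetical source files; a real site would list its actual assets.
Path("reset.css").write_text("body { margin: 0; }\n")
Path("layout.css").write_text(".row { display: flex; }\n")

def bundle(sources, target):
    """Concatenate the given files into one, annotating each section
    with a comment naming its source file."""
    parts = []
    for src in sources:
        parts.append(f"/* --- {src} --- */\n" + Path(src).read_text())
    Path(target).write_text("".join(parts))

bundle(["reset.css", "layout.css"], "bundle.css")
print(Path("bundle.css").read_text())
```

Order matters when concatenating CSS, since later rules override earlier ones, so the source list should mirror the order the files were previously loaded in.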
It will be interesting to watch the reports in New Relic as we implement our strategy, and to track the resulting changes in bounce rate, time on site and conversion rate in our analytics package.
References

- The average web page has almost doubled in size since 2010 — Web Performance Today
- The State of the Internet (2nd Quarter, 2013 Report) — Akamai
- The Performance of Web Applications: Customers are Won or Lost in One Second — Aberdeen Group
- Akamai Reveals 2 Seconds as the New Threshold of Acceptability for eCommerce Web Page Response Times — Akamai
- How Website Speed Actually Impacts Search Ranking — The Moz Blog