It’s practically a universal truth: just about every website is too slow. There are many, many reasons for this unfortunate fact, of course, but Dave Methvin, president of the jQuery Foundation, has some practical advice on how to get the biggest speed improvements with the least amount of effort.
In a recent speech at PrairieCon, long-time tech journalist turned developer Methvin laid out his approach to why “The Web Doesn’t Have to Be Slow.” This post is adapted from that presentation (all 85 pages embedded in the Slideshare below) and interviews with Methvin.
Find slow stuff… make it not slow
“A lot of times when people are [trying to speed up their sites], they start at the wrong end of the problem,” Methvin said. “They are measuring how long it takes to do lots of loops, but they aren’t looking for bottlenecks in their code.”
Properly identifying bottlenecks has been an issue for decades: “We were doing this 25 years ago at PC Week when we were measuring network latencies, and people still fall into the same traps today,” Methvin said. The key is to address the issues that really matter and ignore the ones that don’t. For example, does it make sense to run instead of walk to your car before you go on a long road trip? “It really doesn’t matter in the long run,” Methvin said, “because ultimately you are going to be in your car and driving for a lot longer than the amount of time you might save getting to it in front of your house.” This means it is important to scrutinize everything in your code to understand how much each line can impact your performance. “You have to look carefully at when you are time-bound or causing something that is going to have a big effect,” he added.
Because it’s often so difficult to predict where the real bottlenecks are, the first step is to measure where the time is actually going, then optimize from there. Fortunately tools like New Relic are available to help you do that.
The Rules of the Road
Methvin’s advice boils down to a handful of rules. Let’s take a closer look at each one, along with some related advice. While many developers may be familiar with most of these principles, it’s all too easy to forget to apply each one every time.
Avoid 3xx redirects. These penalize performance because each redirect adds another full network round trip before the browser can even request the destination page. While you can’t always avoid redirects (search engines like canonical URLs, for instance), it’s important to do what you can to keep them to a minimum. “The round trip times across the Internet can kill you and can degrade your performance,” Methvin says.
Start requests early on your pages. Put requests for external resources such as images at the top of the page source code whenever possible, so they will be the first bits delivered to the browser. This will also help with prefetching images for subsequent loads, which should also be part of your HTML coding.
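In markup terms, that might look like the sketch below (the paths are hypothetical): the hero image is requested early in the source, and a resource a subsequent page will likely need is hinted with a prefetch link.

```html
<head>
  <!-- idle-time hint: fetch a resource the next page will likely need -->
  <link rel="prefetch" href="/img/next-page-banner.jpg">
</head>
<body>
  <!-- requested early because it appears early in the source -->
  <img src="/img/hero.jpg" alt="Hero image">
  ...rest of the page...
</body>
```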
Maximize browser caching. Make sure your caching defaults are set up properly so that stable content (like corporate logos or other frequently used and infrequently changed items) is cached for as long as it remains valid.
Don’t prematurely expire cached content if it doesn’t change often. This lets you take maximum advantage of browser caching. The less the browser has to reload, the faster your pages will appear.
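For a stable asset such as a logo, that might translate into response headers like these (the values are illustrative, not a recommendation for every site):

```
Cache-Control: public, max-age=31536000
Expires: Thu, 31 Dec 2026 23:59:59 GMT
```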
Use domain sharding to spread requests across domains. If you need to load a bunch of resources from a single domain, consider spreading those resource requests across several domains at once to maximize bandwidth (you are downloading elements in parallel rather than in series). Many top-trafficked websites use this technique to boost performance. Domain sharding takes just a couple lines of code and it can have a big impact on your page load times.
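In markup, sharding is just a matter of pointing assets at different hostnames that serve the same content — a sketch with hypothetical hostnames:

```html
<!-- the browser opens parallel connections to each hostname -->
<img src="http://static1.example.com/img/logo.png" alt="Logo">
<img src="http://static2.example.com/img/banner.png" alt="Banner">
<script src="http://static3.example.com/js/app.js"></script>
```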
One example is loading jQuery from a shared external URL that many browsers will already have cached. There are three reasons for doing this:
- Decreased network latency: a well-placed external host can deliver the jQuery file faster than your own server
- Making your page code more parallel so that multiple downloads can run concurrently
- Clients might already have a cached copy of jQuery from some other site, which can obviate the need to download any new code at all
Consider a content distribution network. If minimizing requests, maximizing browser caching, and domain sharding are still not enough, a content distribution network (CDN) can do the caching job for a wide collection of browsers. For further discussion of the issue, along with code snippets showing how to use Google’s CDN to host jQuery libraries for free, check out Encosia. There are Google CDN plug-ins for WordPress sites, too.
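A common pattern — popularized by Encosia, among others — loads jQuery from Google’s CDN and falls back to a local copy if the CDN is unreachable (the local path and version number here are placeholders):

```html
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="/js/jquery-1.10.2.min.js"><\/script>')</script>
```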
Load non-critical stuff later. Code your site to wait until after the page content loads to add things that aren’t critical to viewing the page, such as content not initially visible, social media tools, ads, and page analytics.
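One way to do this is to inject non-critical scripts only after the page’s load event fires — a browser-only sketch, with a placeholder URL:

```html
<script>
// wait until the page itself has finished loading
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = '/js/analytics.js';   // placeholder: analytics, ads, social widgets...
  s.async = true;
  document.body.appendChild(s);
});
</script>
```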
Modernize browser detection. Another common coding mistake is to employ browser version or client detection so you can push out particular features that depend on more modern browsers. Granted, many users still rely on older browser versions, and most developers want their websites to work for as broad an audience as possible. But sniffing the browser in your code is decidedly old school; test for the features you need instead. Putting outdated IE-specific code in your pages can slow down page rendering for all browsers.
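The usual modern alternative is feature detection: ask whether the capability exists rather than which browser you are in. A minimal sketch — the helper name is my own, not from the talk:

```javascript
// Feature detection: test for the capability, not the browser version string.
function hasFeature(host, name) {
  return host != null && name in Object(host);
}

// In a browser you would probe real host objects, e.g.:
//   hasFeature(navigator, 'geolocation')
//   hasFeature(document, 'querySelector')
// The same check works on a plain stand-in object:
console.log(hasFeature({ geolocation: {} }, 'geolocation')); // true
console.log(hasFeature({}, 'geolocation'));                  // false
```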
Devote extra scrutiny to third-party code. “Any code that you place inside the <script> tags has complete control over your page’s performance,” Methvin warned. Third-party scripts can be big time wasters if you don’t understand what actions they are performing.
“This is a big topic, just like fixing cars,” Methvin concluded. “The more you do it, the more you understand the depth and dimensions of your problems. But at least following some of my suggestions you can be sure that the resources you put into your browsers are in the best order and optimized for the best possible viewing.”