It's been a good couple of months since I was asked to write a post about reducing the page load times on this blog. In the meantime there's been a big "will they/won't they" around Google using page load times as a ranking metric, every man and his dog has blogged about the subject and every day Dave's accusing stare is getting harder to ignore.
So, the blog is loading too slowly, apparently. What now?
Measuring Page Load Time
Before you can claim you’ve sped something up, you need to know how fast (or slow) it is in the first place. There are dozens of tools out there that purport to give you that information. I’m not going to pretend to be an expert on all of them, and I’m sure others have written about them at length. Instead, I’ll just give you my main recommendations.
Yahoo’s YSlow is an add-on for an add-on – the excellent Firebug for Firefox. It’s great: it shows a comprehensive, easy-to-understand report on how well optimised a page is for load time. Each of the criteria links to a page on Yahoo’s “Best Practices” site. It even gives you a straightforward A to F grade you can pat yourself on the back with when you’re done.
Not such a snappy name, but extremely useful: Pagetest is an Internet Explorer-based tool developed by AOL. I don’t recommend running it directly, though – there’s a publicly accessible version with a few hosting locations around the world. Give it a try now on your own site. I tried a number of other tools before finding this one and wasn’t too impressed – Pagetest uses a real web browser to fetch the content, so it’s a much more accurate measure of what a real user would experience.
There’s also Google’s Page Speed, which is similar to YSlow but I didn’t find it as useful. Plus it’s less pretty, and isn’t that the important thing?
Of course, Google Webmaster Tools now has a “Site performance” section under Labs. It’s presumably a good measure of the real world times users are actually seeing, but it doesn’t give much insight into the reasons for those times. I keep an eye on it but wouldn’t recommend it as your main tool – some of the advice it gives is downright infuriating. “Enable gzip compression for http://www.google-analytics.com/ga.js” repeated ten times. Thanks for that.
What’s Taking So Long?
If you’ve run a Pagetest report on your own site you can probably see it right in front of you.
A web browser limits the number of simultaneous connections it will open to a remote server. If a page references more images (or JS and CSS files) than that, some of them won’t even start loading until the first batch has finished. You can see that quite clearly on this report: at the bottom of the ‘waterfall view’, some of the images don’t start loading until 7 or 8 seconds after the page was requested. Ouch.
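To get a feel for the numbers, here’s a toy simulation of that queueing effect. The durations and connection limits are made up for illustration – real browsers interleave downloads in far more complicated ways:

```python
import heapq

def total_load_time(durations, max_connections):
    """Greedy schedule: each resource starts on the earliest free connection."""
    free_at = [0.0] * max_connections  # when each connection next becomes free
    heapq.heapify(free_at)
    finish = 0.0
    for d in durations:
        start = heapq.heappop(free_at)  # earliest available connection
        end = start + d
        heapq.heappush(free_at, end)
        finish = max(finish, end)
    return finish

# Twelve one-second images over two connections vs. six:
print(total_load_time([1.0] * 12, max_connections=2))  # 6.0
print(total_load_time([1.0] * 12, max_connections=6))  # 2.0
```

Tripling the connection count cuts the made-up total from six seconds to two – which is exactly the effect the waterfall view makes visible.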
In general, the fewer external files you’re referencing from your page the faster it will load. Unfortunately in the real world, you want lots of pretty images, CSS and dynamic content. So what can you do?
Speeding It Up
Rather than repeat all of the advice you can get from YSlow, I’m just going to list the big things that helped drop the page load time on this blog. In a sort-of descending order of impact:
- Reduced the number of images on the page. I made as many of them as possible into a CSS “sprite” – essentially, combining small images into one big one and using the CSS background-position property to show the correct part of it. Take a look at the sprite for this blog and you’ll see what I mean. There’s a very quick and easy way to do this with a bookmarklet called SpriteMe.
- Set up some DNS aliases for static content – img.davidnaylor.co.uk, etc. The more hostnames your page requests from, the more parallel downloads you get – up to a limit, since each additional hostname adds an extra DNS query to look up its IP address, and that adds up. This has been covered to death in much more detail than I could provide.
- Moved this site to a dedicated server on a faster connection external to our office – at Fasthosts… which is a whole other story in itself!
- Made sure the web server was sending out compressed content (on Apache, check out mod_deflate) to reduce the amount of data the user has to download, and that sensible “Expires” headers are being sent so caching will work (mod_expires).
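For the sprite in the first bullet, the CSS ends up looking something like this. The class names, file path and pixel offsets are invented for illustration, not taken from this blog’s actual stylesheet:

```css
/* One combined image, e.g. 16x16 icons stacked vertically in sprite.png. */
.icon {
    width: 16px;
    height: 16px;
    background-image: url(/images/sprite.png);
    background-repeat: no-repeat;
}
/* Shift the background so the right part of the sprite shows through. */
.icon-rss     { background-position: 0 0; }      /* first icon  */
.icon-twitter { background-position: 0 -16px; }  /* second icon */
.icon-email   { background-position: 0 -32px; }  /* third icon  */
```

Three icons, one HTTP request.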
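And for the last bullet, a rough sketch of the Apache side – the content types and cache lifetimes here are examples to adapt, not a recommendation, and both modules need to be enabled first:

```apache
<IfModule mod_deflate.c>
    # Compress the text-based content types; images are already compressed.
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets rarely change, so let browsers cache them for a month.
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```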
There are of course many other optimisations to try that I haven’t covered here – these are just what I consider the low-hanging fruit, and the law of diminishing returns applies as always. If anyone has any easy ones I’ve missed, please leave a comment below!
Of course, after I’d done all of that someone (who shall remain nameless) added a TweetMeme button to every post. That button pulls a JS file from tweetmeme.com that sometimes takes upwards of five seconds to load. A few of them on a page and my shiny new page load statistics go out of the window.
The moral of that story is that every time you embed someone else’s content into a page – be it YouTube, Google Analytics, TweetMeme… – your total page load time is at their mercy. Just something to keep in mind. That’s my excuse for not linking to a sub-2-second report at the end of this post.
My personal opinion is that page load time isn’t going to be a major ranking factor for search engines in the near future. From a user experience point of view, though, it’s worth investing some time in.