
14 rules for fast web pages

Steve Souders of Yahoo's "Exceptional Performance Team" gave an insanely great presentation at Web 2.0 about optimizing website performance by focusing on front-end issues. Unfortunately I didn't get to see it in person, but the Web 2.0 talks have just been put up, and the ppt is fascinating and an absolute must-read for anyone involved in web products.

His work has been serialized on the Yahoo user interface blog, and will also be published in an upcoming O'Reilly title (estimated publish date: Sep '07).

We have so much of this wrong at Topix right now that it makes me want to cry, but you can bet I've already emailed this ppt to my eng team. :) Even if you're pure management or product marketing, you need to be aware of these issues and how they directly affect user experience. We've seen a direct correlation between site speed and traffic.

This is a big presentation, with a lot of data in it (a whole book's worth, apparently), but halfway through he boils it down into 14 rules for faster front-end performance:

  1. Make fewer HTTP requests
  2. Use a CDN
  3. Add an Expires header
  4. Gzip components
  5. Put CSS at the top
  6. Move JS to the bottom
  7. Avoid CSS expressions
  8. Make JS and CSS external
  9. Reduce DNS lookups
  10. Minify JS
  11. Avoid redirects
  12. Remove duplicate scripts
  13. Turn off ETags
  14. Make AJAX cacheable and small

The full talk has details on what all of these mean in practice. The final slide of the deck is a set of references and resources, which I've pulled out here for clickability:
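Rule 4 is easy to verify for yourself. Here's a minimal Python sketch (the CSS snippet is made up for illustration) showing how dramatically a typical repetitive text component shrinks under gzip:

```python
import gzip

# A hypothetical chunk of CSS, repeated to simulate a real stylesheet.
css = ("body { margin: 0; padding: 0; font-family: Arial, sans-serif; }\n" * 200).encode("utf-8")

compressed = gzip.compress(css)

# Gzipped text is typically a small fraction of the original size.
print("original: %d bytes, gzipped: %d bytes" % (len(css), len(compressed)))
```

Real stylesheets compress less dramatically than a repeated line, but 60-80% savings on text components is common, which is why the rule applies to HTML, JS, and CSS alike.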

book: http://www.oreilly.com/catalog/9780596514211/
examples: http://stevesouders.com/examples/
image maps: http://www.w3.org/TR/html401/struct/objects.html#h-13.6
CSS sprites: http://alistapart.com/articles/sprites
inline images: http://tools.ietf.org/html/rfc2397
jsmin: http://crockford.com/javascript/jsmin
dojo compressor: http://dojotoolkit.org/docs/shrinksafe
HTTP status codes: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
IBM Page Detailer: http://alphaworks.ibm.com/tech/pagedetailer
Fasterfox: http://fasterfox.mozdev.org/
LiveHTTPHeaders: http://livehttpheaders.mozdev.org/
Firebug: http://getfirebug.com/
YUIBlog: http://yuiblog.com/blog/2006/11/28/performance-research-part-1/
YDN: http://developer.yahoo.net/blog/archives/2007/03/high_performanc.html

Update: Yahoo has summarized these nicely on their developer blog.


Listed below are links to weblogs that reference 14 rules for fast web pages:

» Rules for Fast Web Pages from UFies.org
Interesting reading for the webheads out there is 14 rules for fast web pages: Steve Souders of Yahoo's "Exceptional Performance... [Read More]

» 14 Rules for Fast Web Pages in Web 2.0 from Webszene
A member of the Yahoo “Exceptional Performance Team,” Steve Souders, recently put together a good presentation on how to optimize websites with a focus on the front end. Here are the key facts: 1. Make fewer HTTP requests ... [Read More]

» 14 rules for fast web pages from michaelmuller.net | Diseño y Desarrollo Web
Steve Souders of Yahoo’s "Exceptional Performance Team" gave an insanely great presentation at Web 2.0 about optimizing website performance by focusing on front end issues. This is a big presentation, with a lot of data in it (a whole ... [Read More]

» 14 rules for fast web pages from roScripts - Webmaster resources and websites
This is a big presentation, with a lot of data in it (a whole book's worth apparently), but half way through he boils it down into 14 rules for faster front end performance: [Read More]

» Links for 2007-05-13 [del.icio.us] from Chen Qian's blog
14 rules for fast web pages (Skrentablog) [Read More]

» 14 rules for fast web pages from roScripts - Webmaster resources and websites
Steve Souders of Yahoo's "Exceptional Performance Team" gave an insanely great presentation at Web 2.0 about optimizing website performance by focusing on front end issues. [Read More]

» YSlow: Yahoo's Problems Are Not Your Problems from Coding Horror
I first saw Yahoo's 13 Simple Rules for Speeding Up Your Web Site referenced in a post on Rich Skrenta's blog in May. It looks like there were originally 14 rules; one must have fallen off the list somewhere... [Read More]

» Ranking Web 2.0 sites by server latency from Skrentablog
Server latency is the start of the battle for site performance. There are great tutorials on how to optimise your html, but if your server takes too long sending the bytes out in the first place, there's nothing the browser... [Read More]

Comments (21)

Awesome link, dude. CSS sprites are invaluable!

just looking at the list, I know I have a LOT to learn!

The information about ETags in the presentation seems to be misleading.

The ETag for a single entity is always different across servers, because Apache's default ETag format is inode-size-timestamp, and the inode differs on each machine.

With Apache, it is possible to configure how etags are generated:

For most clustered configurations, you just need to do:
FileETag MTime Size

If you have other considerations, it is trivial to write a custom apache module, to include other information.

I don't see any real reason to disable ETags -- they generally help.



Interesting read, but how can I modify the Expires header of an image or CSS file? Using a file attribute utility, I managed to modify the date on my PC, but it's lost in the upload?

Awesome stuff! Thank you so much for taking the time to post the information AND especially for pulling the links out of the PPT and publishing them here.

By the way, I hope your eng team will enjoy the information as much as I have!

And just in case, here are some additional resources that might be helpful for anyone who wants to make their web pages load faster:

Loading Time Checker

Web Page Analyzer

Load Time Testing, etc.

For the list, I highly recommend people use the free command-line tool PNGOUT to compress their PNG images as well.

PNGs are a great format; unfortunately, you'd be surprised at how bloated the files that Photoshop and other image editing software save can be.

The problem with putting JS at the bottom is that functions aren't defined until the page has loaded, so anything that calls them earlier throws JS errors and breaks the page.

@Paul Querna

He means set the expiration header, not the file's time/date.

This means that you can direct the browser to know when or when not to use the cached copy of a file and when to download a new one.

There is no need to download the same css file, for example, every time you refresh a page. Once you download it, the browser should load the cached copy to improve speed.

Set the expiration to a longer period and it will be cached longer.
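Concretely, the header is just an HTTP-date in the future. A minimal Python sketch of building a far-future Expires header (the one-year horizon is a common policy for versioned static assets, not a requirement):

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

# Pick an expiration one year out; format_datetime with usegmt=True
# emits the RFC 1123 date format that HTTP headers require.
expires = datetime.now(timezone.utc) + timedelta(days=365)
header = "Expires: " + format_datetime(expires, usegmt=True)
print(header)
```

In practice you'd set this in your server config rather than application code (e.g. Apache's mod_expires), but the header it emits looks exactly like this.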

One thing to remember is to make sure your server will GZip not only HTML pages, but also .js and .css files. Sometimes server configs only compress HTML output by default. JS and CSS files typically end up being larger than most HTML pages, so it's pretty important!

The recommendation to remove ETags should probably be clarified. I think, in general, sending ETag and Last-Modified and supporting conditional GET is a good idea.

According to the presentation, his main problem with them is how most web server software generates the ETag, and how it can be different on each server in a cluster. Apache uses the inode, MTime (modification time), and Size (file size) to generate the ETag. The inode is going to be different on each server, while the Size and MTime should be the same, assuming the content is in sync across all the servers and the original modification times are preserved.

If you update Apache (or another web server) to use only the modification time and size to generate the ETag, and you have a good content synchronization process, I don't see any reason why you can't send ETag (and Last-Modified) in order to support conditional GET on your cluster.


What kind of redirects are you talking about? Meta or server-side?

Wow, great commentary here... I've learned a lot from this preso, and more from the feedback...

Re redirects -- I assume server-side.

Re etags -- all the feedback makes sense... my approach would be to md5 the content of the served object, which should be independent of the front end node serving the page...
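That md5-of-content approach can be sketched in a few lines of Python (the helper name is made up; hashlib is stdlib):

```python
import hashlib

def content_etag(body: bytes) -> str:
    """Derive an ETag from the response body itself, so every server
    in a cluster produces the same tag for the same bytes, regardless
    of inode or file timestamp."""
    return '"%s"' % hashlib.md5(body).hexdigest()

# Identical content yields an identical ETag on any node.
print(content_etag(b"body { margin: 0 }"))
```

The trade-off is that you have to hash (or pre-compute and cache the hash of) the body on each response, whereas inode/mtime-based tags come free from a stat call.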


How should I gzip .js and .css files?


12 Remove duplicates
12 a Remove duplicate scripts
12 b Remove duplicate styles
12 c Remove duplicate classes
12 d Remove duplicate spaces
12 e Remove duplicate words (all previous "Remove duplicate" are subjects to be removed)


Rule 0. Cache your dynamic pages. Do not rebuild whole page each time, when user asks it. Cache it, make it static!
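A minimal sketch of that "rule 0" idea in Python: cache the rendered page with a time-to-live instead of rebuilding it on every request. All names here are hypothetical stand-ins for your real render path:

```python
import time

_cache = {}   # path -> (rendered page, timestamp)
TTL = 60      # seconds to serve the cached copy before re-rendering

def render_page(path):
    # Stand-in for an expensive template/database render.
    return "<html>rendered %s</html>" % path

def get_page(path):
    entry = _cache.get(path)
    if entry is not None and time.time() - entry[1] < TTL:
        return entry[0]                  # cache hit: skip the rebuild
    page = render_page(path)
    _cache[path] = (page, time.time())   # cache miss: render and store
    return page
```

Real systems add invalidation on content change and cap memory use, but even this shape turns N renders per minute into one.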

Thanks for all the feedback. The Web 2.0 presentation was 3 hours long and we weren't able to cover all of the slides! There is a lot of information behind the 14 rules. You can read the "Rough Cuts" version of the book chapters online now at O'Reilly (you have to pay). You can get snippets from a series of blogs I'm writing for Yahoo! Developer Network. Also, Tenni Theurer has written a great series of articles about some of our research on the Yahoo! User Interface Blog. I'm presenting at OSCon in July. This high level of interest in improving web performance is awesome. I hope to see you at OSCon.


If your server allows you to configure mod_gzip from your .htaccess file, I believe you can add these lines in to compress .js and .css files:

mod_gzip_item_include file \.js$
mod_gzip_item_include file \.css$

Check http://www.webmasterworld.com/forum83/547.htm for a little more info. I found those lines there.


Good point about caching pages. Most small- to mid-sized sites don't do this, but they should (I imagine large sites do). The key is to have a content management system that integrates well with the caching strategy. I found this WordPress plugin that makes caching pages a snap for anyone who uses WP to manage their site: http://ibloggedthis.com/2006/05/20/wordpress-caching-how-to/

Great tips! I can always use advice as to how to speed up my life. I liked these so much I listed them on ListAfterList.com. If you have any further advice, this seems to be the place to put it. Check it out: http://listafterlist.com/ListResults/tabid/57/ListID/7091/Default.aspx


Ah, the irony: it suggests minifying JavaScript yet the JavaScript for the page is NOT minified!

This is by far the best list for achieving faster websites that I've ever seen. I added some of my own tips at Top Methods for Faster Sites

It is beyond me why there’s no standard for resource packaging (archiving all required files) for HTML.

See my thoughts at http://mindtrickle.wordpress.com/2009/07/03/packaging-resources-in-html-files/



This page contains a single entry from the blog posted on May 10, 2007 6:00 AM.
