These techniques are central to Web site performance tuning, at least where how your code is put together is concerned. Since I've been doing a fair amount of ASP.NET and Sitecore work over the last year or so, this has been on my mind a lot.
Milan runs a Web site I've long been a fan of for a number of reasons; in particular, he has featured exceptional content on Web site performance over at aspnetresources.com.
Milan's site first came to my attention as one of the few sites out there to focus largely on both ASP.NET and Web standards-based approaches, a combination you didn't see very much, at least back in the day. Another subject clearly near and dear to his heart is performance. He's written about it, and he has also posted a great tool I find myself returning to from time to time that analyzes how much savings you can get by using GZIP to compress your HTML on the server.
In recent years there's been a ton of research on how to speed up Web site load times. Two fairly major ideas are reducing the number of HTTP connections the browser makes to the server, and using GZIP compression. Every connection takes packets and bytes, and did you know that connections are made to the server even for items that might be cached? Ever heard of a 304 "Not Modified" response? Depending on how your server is configured, the browser may ask the server whether a file has changed even when it hasn't, and the server responds accordingly. There are ways to prevent that, but if you're not doing them, every one of those connections is work the browser, your computer, and the server are doing. Each might be quick, but they add up. Setting "Expires" headers helps reduce those lookups, but that can be complex (or even impossible) to set up in some environments.
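To make the 304 dance concrete, here's a minimal Python sketch of the decision a server makes on a conditional GET. The `respond` function and the sample date are my own illustration, not any particular server's implementation: the browser sends back the `If-Modified-Since` date it cached, and the server either returns a full 200 with the body or a body-less 304.

```python
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

def respond(file_mtime, if_modified_since=None):
    """Sketch of a server handling a conditional GET.

    file_mtime: when the file on disk last changed (UTC).
    if_modified_since: the If-Modified-Since header the browser sent, or None.
    Returns (status_code, body_sent).
    """
    if if_modified_since:
        since = parsedate_to_datetime(if_modified_since)
        # HTTP dates have one-second resolution, so compare at that granularity.
        if file_mtime.replace(microsecond=0) <= since:
            return 304, False   # "Not Modified": headers only, no body
    return 200, True            # full response, file body included

mtime = datetime(2009, 5, 1, 12, 0, 0, tzinfo=timezone.utc)

# First visit: nothing cached, full download.
print(respond(mtime))                          # (200, True)

# Revalidation: browser asks "has it changed since I cached it?"
print(respond(mtime, format_datetime(mtime)))  # (304, False)
```

Note that even the 304 case is a full round trip to the server; a far-future Expires header is what lets the browser skip the question entirely.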
Not only that, but as icing on the cake, the tool also minifies and GZIPs the results, so that when they're returned to the client (your browser) they come down the pipe in smaller pieces.
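If you want a quick feel for the kind of savings GZIP offers on markup, here's a small Python sketch (the sample HTML is made up for illustration; real savings depend on your pages, though HTML's repetitive tags typically compress very well):

```python
import gzip

# Hypothetical page: repetitive markup, like most real-world HTML.
html = ("<!DOCTYPE html><html><head><title>Demo</title></head><body>"
        + "<p>Repetitive markup compresses very well.</p>" * 200
        + "</body></html>")

raw = html.encode("utf-8")
# Level 6 is a common default trade-off between CPU cost and size.
compressed = gzip.compress(raw, compresslevel=6)

print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes "
      f"({100 * (1 - len(compressed) / len(raw)):.0f}% smaller)")
```

On highly repetitive HTML like this, the compressed size comes out at a small fraction of the original, which is exactly why it's worth enabling on the server.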
I've blogged about it before, both here at NavArts.com and on my personal site, Cherny.com (here and here). And in the book I contributed to, there's a chapter from Kevin Lawver and an appendix from yours truly that touch on many performance enhancements.
At NavArts we've used GZIP on a number of projects, in particular ones that seemed too heavy for their own good, such as Behind the Badge (DC United's blog). My coworkers Tim Stephens and Corey Burnett also configured the new NavigationArts.com site to use GZIP. If you haven't given it a shot, it's well worth looking into.
For more information on configuring GZIP, there are tons of resources out there; all you need to do is look for them:
For troubleshooting and debugging, there are client-side, browser-based tools that can help identify the slow parts of your pages and inspect the requests to, and responses from, the server: YSlow from Yahoo! and Google's Page Speed. Utilities such as Charles and Fiddler are also a great help when tuning performance; I can't recommend them enough, as they let you inspect every HTTP connection your browser makes. Charles can even add up the bytes spent on those 304 connections, if for some reason you don't believe me…
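The kind of tally Charles does for those 304s is easy to sketch yourself. Here's a Python illustration that sums the request and response bytes for revalidations in a made-up set of recorded connections (the URLs and byte counts are hypothetical, standing in for what a proxy would capture):

```python
# Hypothetical log of one page load, as a proxy like Charles or Fiddler
# might record it: even a 304 costs request headers out and headers back.
connections = [
    {"url": "/",         "status": 200, "request_bytes": 420, "response_bytes": 14200},
    {"url": "/site.css", "status": 304, "request_bytes": 510, "response_bytes": 180},
    {"url": "/logo.png", "status": 304, "request_bytes": 495, "response_bytes": 180},
    {"url": "/app.js",   "status": 304, "request_bytes": 505, "response_bytes": 180},
]

# Total bytes spent asking about files that never changed.
overhead = sum(c["request_bytes"] + c["response_bytes"]
               for c in connections if c["status"] == 304)
print(f"{overhead} bytes spent revalidating unchanged files")  # 2050 bytes
```

Two kilobytes may not sound like much, but multiply it across dozens of assets and every page view, and the case for far-future Expires headers makes itself.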