Modern websites necessarily make a lot of requests back to the server for resources. Through books like those by Steve Souders, we've become increasingly aware that each of these requests delays whatever your user has requested from being downloaded and displayed. If you don't have both of Steve's books, get them, and spend an hour or two going through his site; it's awesome stuff.

The term commonly used for the various efforts to reduce the number of requests made to a server is HTTP optimization. This is a topic I've been investigating and playing with for quite a while (e.g., my CSS Combiner / Minifier ASP.NET control), and in fact one of the projects I worked on shortly before leaving the ASP.NET team morphed into the Sprite and Image Optimization Framework recently released as a preview by the ASP.NET team.

What I want to do with this post isn't to go over these techniques in any great detail; rather, it's to cover some of the things I currently do when developing sites. In short, there are four techniques you can realistically use to improve the performance of a loading web page:

1. Make the server code run REALLY fast, either by buying big servers, writing really slick code, or by caching everything so the site basically works like a static (plain HTML) site.
2. Make your web page really plain: no CSS, JS, or images, just plain old HTML. A single request is always quickest, but this is of course not usually acceptable!
3. Radically reduce the number of requests made to the server.
4. Make each request deliver more content for less bandwidth.

Of these, I'm going to take a closer look at the last two.

The very first thing you need to do when trying to work out how to fix page-loading issues is to actually identify those issues! I tend to use Firefox when I'm looking at them, mainly because it has an amazingly powerful tool in Firebug for debugging and investigating pretty much anything a web site does on the client.

Read more: mostlylucid
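To make the last two techniques concrete, here is a minimal sketch (in Python, purely illustrative; the function names and the whitespace-stripping "minifier" are my own simplifications, not the author's CSS Combiner / Minifier control or any real minification library): concatenating several CSS files into one cuts the number of requests, and gzip compression delivers the same content for less bandwidth.

```python
import gzip
from pathlib import Path


def combine_css(paths, out_path):
    """Concatenate several CSS files into one, so the browser makes a
    single request instead of one per stylesheet (technique 3)."""
    combined = "\n".join(Path(p).read_text() for p in paths)
    # Naive "minification": drop blank lines and surrounding whitespace.
    # Real minifiers also strip comments, shorten colors, etc.
    minified = "\n".join(
        line.strip() for line in combined.splitlines() if line.strip()
    )
    Path(out_path).write_text(minified)
    return minified


def gzip_savings(text):
    """Compare raw vs gzip-compressed byte sizes (technique 4).
    Web servers apply this transparently via Content-Encoding: gzip."""
    raw = text.encode("utf-8")
    return len(raw), len(gzip.compress(raw))
```

The two techniques compound: one combined, minified, gzipped stylesheet costs a single request and a fraction of the bytes of several uncompressed files.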