On high-traffic sites you always have to make choices about performance, and compression can be part of the solution. Of course, you need to be aware of the trade-offs: the CPUs on the server and the client do extra work compressing and decompressing, but bandwidth usage drops significantly.
Most modern browsers support content compressed with the standard schemes GZip and Deflate. Content encoding was suggested in HTTP 1.0 and specified in HTTP 1.1, which also means that most crawlers and other services using the standard protocols handle compressed content.
So, what do you need?
There are several ways to compress content and files. One is to let IIS do it for you. I haven't tried this, as people report having to hack the IIS metabase to get it working. That seems NOT the way to go, if you know what the metabase is.
The other approach is to write your own HttpModule and let it do the job. There are of course commercial products that do this for you, but it doesn't take a lot of code to create such a module. You can start by downloading Ivan Porto Carrero's excellent example. Compile it and add the HTTP module to your web.config file. You should now see that the ASPX page is reduced to a minimum. If you don't see it, download the Web Developer Firefox plug-in and you should get something like this (Information -> Document Size):
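To give an idea of how little code is involved, here is a minimal sketch of such a compressing module. The class name is hypothetical and the real module from the download is more complete (it handles exclusions, caching headers and so on), but the core idea is just to wrap Response.Filter in a compressing stream when the client's Accept-Encoding header allows it:

```csharp
using System;
using System.IO.Compression;
using System.Web;

// Hypothetical minimal compression module - a sketch, not the downloaded one.
public class SimpleCompressionModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += new EventHandler(OnBeginRequest);
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;
        string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";

        if (acceptEncoding.Contains("gzip"))
        {
            // Wrap the response stream so the output is gzipped on the fly.
            context.Response.Filter =
                new GZipStream(context.Response.Filter, CompressionMode.Compress);
            context.Response.AppendHeader("Content-Encoding", "gzip");
        }
        else if (acceptEncoding.Contains("deflate"))
        {
            context.Response.Filter =
                new DeflateStream(context.Response.Filter, CompressionMode.Compress);
            context.Response.AppendHeader("Content-Encoding", "deflate");
        }
        // If the client advertises neither scheme, the response is sent as-is.
    }

    public void Dispose() { }
}
```

A real module also needs to exclude certain paths and MIME types, which is exactly what the configuration section below is for.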
Notice! You should exclude WebResource.axd and ScriptResource.axd in the section for the HttpCompression handler in web.config. If you compress WebResource.axd, the standard .NET validation scripts and the postback script will stop working. ScriptResource.axd is used by MS AJAX, which has its own compression module that takes care of the AJAX scripts. Typically, the HttpCompression section should look like this:
<add path="scriptresource.axd" />
<add path="webresource.axd" />
<add mime="image/jpeg" />
<add mime="image/jpg" />
<add mime="image/gif" />
<add mime="image/png" />
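Besides the exclusions above, the module itself has to be registered under <httpModules> in web.config. A sketch of what that looks like; the name, type and assembly here are placeholders, so use the actual ones from the module you downloaded:

```xml
<system.web>
  <httpModules>
    <!-- Hypothetical type/assembly names - take the real ones
         from the compiled module's documentation. -->
    <add name="HttpCompressionModule"
         type="HttpCompress.HttpCompressionModule, HttpCompress" />
  </httpModules>
</system.web>
```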
Now you should have reduced the bandwidth needed to serve the page. As mentioned earlier, this might not be the best approach for all sites. You need to consider whether you have enough CPU on the server to compress everything, or enough bandwidth to serve files uncompressed. If you are short on both, you should probably consider buying new hardware or upgrading your bandwidth. Also remember that files are decompressed and cached in the user's browser. This is not the case on the server: every requested page is compressed again after it is rendered.