Last week, I mentioned the caching system in ASP.Net, and after some feedback from readers, I promised to write more about it. To be honest, I am not sure what I could add at the fundamental level, since the MSDN library really has everything you need to know about how to use it. Here are links to the details for versions 1.1, 2.0, and 3.0 of the .Net Framework. So I think it is better for me to focus on caching overall in the context of ASP.Net Web applications.
ASP.Net supports two types of caching: data caching and page caching. Data caching lets you retain data that would normally fall out of scope and become eligible for garbage collection once the page has finished processing. Page caching lets the server store a page's rendered output and serve it from memory instead of reprocessing the page. Both mechanisms provide ways to invalidate the cache. When a data element drops out of the cache, you can either catch a removal callback and regenerate it, or check whether it is still there when you need it and regenerate it at that point. With page caching, the server simply reprocesses the page once the cached copy has been invalidated.
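To make the data-caching side concrete, here is a minimal sketch of both approaches using the System.Web.Caching API. The "ProductList" key and the LoadProductsFromDatabase() helper are hypothetical stand-ins for your own data access code.

using System;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static object GetProducts()
    {
        Cache cache = HttpRuntime.Cache;

        // Check-and-regenerate: if the item has fallen out of the cache,
        // rebuild it on demand.
        object products = cache["ProductList"];
        if (products == null)
        {
            products = LoadProductsFromDatabase();
            cache.Insert(
                "ProductList",
                products,
                null,                             // no dependencies
                DateTime.UtcNow.AddMinutes(10),   // absolute expiration
                Cache.NoSlidingExpiration,
                CacheItemPriority.Normal,
                OnProductsRemoved);               // callback fires on eviction
        }
        return products;
    }

    // The callback approach: repopulate the cache as soon as the item is evicted.
    private static void OnProductsRemoved(string key, object value,
                                          CacheItemRemovedReason reason)
    {
        if (reason != CacheItemRemovedReason.Removed)
        {
            HttpRuntime.Cache.Insert(key, LoadProductsFromDatabase());
        }
    }

    private static object LoadProductsFromDatabase()
    {
        // Placeholder for the real data access call.
        return new object();
    }
}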
The trick to caching effectively is to understand the tradeoff that it represents. Caching uses memory, and memory is a rather finite resource; ASP.Net scavenges the cache when memory runs low. Luckily, you may set the priority that determines which items are retained during scavenging; without these hints, the scavenger sweeps out the older and rarely used items first. As a result, keeping a lot of large objects or pages in the cache can end up being counterproductive; if the objects or pages do not stay cached long enough to offset the overhead inherent in caching, there is a net decline in performance. Another point to understand is that caching data that is infrequently used is simply a waste of system resources. In addition, caching data that frequently needs to be invalidated (measured as a percentage of page views, not times per day) is wasteful as well. For example, caching a stock quote ticker that gets displayed three or four times an hour, but needs regeneration every two minutes, is a waste of the server's memory.
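Those priority hints are passed when you insert the item. A quick sketch (the key names and the data being cached are made up for illustration):

using System;
using System.Web;
using System.Web.Caching;

public static class PriorityExamples
{
    public static void CacheWithHints(object countryCodes, object report)
    {
        // Small, expensive-to-rebuild lookup data: ask the scavenger to keep it.
        HttpRuntime.Cache.Insert("CountryCodes", countryCodes, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromHours(1),
            CacheItemPriority.High, null);

        // Large, cheap-to-rebuild output: let it go first when memory gets tight.
        HttpRuntime.Cache.Insert("QuarterlyReport", report, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(5),
            CacheItemPriority.Low, null);
    }
}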
The caching in ASP.Net keeps evolving as well. When deciding whether to use it, be sure you are evaluating the same version of caching that your application will actually run on. For example, the .Net 1.1 Framework did not have SQL Server caching, but .Net 2.0 and 3.0 do. Indeed, the caching in .Net 1.1 was rather simple, with only three types of invalidation (time, file changes, and key changes).
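Those three basic invalidation types all flow through the same Insert call. Here is a rough sketch; the file path and the key names are hypothetical:

using System;
using System.Web;
using System.Web.Caching;

public static class DependencyExamples
{
    public static void CacheWithDependencies(object settings, object lookup)
    {
        Cache cache = HttpRuntime.Cache;

        // Time-based invalidation: expire five minutes after insertion.
        cache.Insert("TimedItem", settings, null,
            DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);

        // File-based invalidation: drop the item if the XML file changes.
        cache.Insert("FileItem", settings,
            new CacheDependency(HttpRuntime.AppDomainAppPath + "Settings.xml"));

        // Key-based invalidation: this item is evicted whenever "FileItem"
        // is removed or replaced.
        cache.Insert("DependentItem", lookup,
            new CacheDependency(null, new string[] { "FileItem" }));
    }
}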
The ability to have SQL Server automatically invalidate caches is quite interesting as well. With SQL Server 2000, the runtime needs to periodically poll the database, and it can only detect that a table has changed. SQL Server 2005 will notify the cache and tell it to invalidate, and it also supports row-level invalidation. To be honest, while this seems like an incredibly cool and useful feature, it also creates a massive amount of vendor lock-in. You may be better served by having your application layer handle the caching itself using key-based dependencies, and invalidating cache entries when it calls the non-retrieval portions of the data access layer. While that may not be quite as spiffy as the automatic notification of row-level changes, you can do something nearly as good with a well designed database: use the objects in the cache to hold onto the main record ID (such as the record ID for the employee table) and cascade invalidation for that main record to the other associated, cached items (such as the payroll table data for that employee). While this may be a good bit of effort, it may pay off to keep your application vendor neutral.
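Here is one way that vendor-neutral approach might look, sketched with key-based dependencies; the key naming scheme and the employee/payroll objects are hypothetical:

using System;
using System.Web;
using System.Web.Caching;

public static class EmployeeCache
{
    // One master key per employee record; associated items hang off of it.
    private static string MasterKey(int employeeId)
    {
        return "Employee:" + employeeId;
    }

    public static void CacheEmployeeData(int employeeId, object employee, object payroll)
    {
        Cache cache = HttpRuntime.Cache;

        // The master entry represents the employee record itself.
        cache.Insert(MasterKey(employeeId), employee);

        // Associated data (the payroll rows, say) depends on the master key,
        // so removing the master entry cascades to it automatically.
        cache.Insert("Payroll:" + employeeId, payroll,
            new CacheDependency(null, new string[] { MasterKey(employeeId) }));
    }

    // Call this from the non-retrieval (insert/update/delete) paths of the
    // data access layer to invalidate everything tied to one employee.
    public static void InvalidateEmployee(int employeeId)
    {
        HttpRuntime.Cache.Remove(MasterKey(employeeId));
    }
}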
While caching can seem like a performance magic bullet, it really must be used judiciously. A poor choice about what data to cache (or not to cache) can hurt performance or waste server resources, making the problem worse than before. Weigh your options and perform some load testing to see how it works. Your best bet is probably to mock up a quickie version of your application; once it emulates the backend performance (using deliberate slowdowns to replicate processing time), put it under a realistic load to see whether the caching yields dividends. Page caching is fairly straightforward to set up and use, at least with simple invalidation rules (time, files), but application data caching and the more advanced page invalidations are a lot of work to try to bolt onto existing code. Caching is something to be planned for from the get-go, not used as a performance afterthought.
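For what it is worth, the simple case really is simple: the <%@ OutputCache Duration="60" VaryByParam="None" %> page directive covers time-based page caching in one line. The code-behind sketch below does roughly the same thing programmatically and adds a file dependency; the page class and the file path are hypothetical.

using System;
using System.Web;
using System.Web.UI;

public partial class ReportPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Cache this page's output on the server for 60 seconds.
        Response.Cache.SetCacheability(HttpCacheability.Server);
        Response.Cache.SetExpires(DateTime.Now.AddSeconds(60));
        Response.Cache.SetValidUntilExpires(true);

        // File-based invalidation: reprocess the page if this file changes.
        Response.AddFileDependency(Server.MapPath("~/App_Data/Report.xml"));
    }
}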
J.Ja