
Improve the performance of your app with ASP.NET caching

With ASP.NET, you can greatly improve application performance by implementing caching. Get the details on what can be cached and how to control the numerous caching options and parameters.


By Mark Strawmeyer

Application performance is an important consideration for any .NET-based Web site. Migrating from ASP to ASP.NET will likely increase the performance of your site, but the migration alone doesn't guarantee that it will perform at the highest level possible. By utilizing the caching functionality built into ASP.NET, you can greatly improve the performance and scalability of your site.

Caching provides performance enhancements in cases where frequently used items can be accessed from an alternate cached location faster than they can be obtained from the originating location. An example of caching would be where a client browser or proxy server stores a local copy of a frequently accessed Web page. The advantage is that the page can be accessed locally faster than it can be accessed across the Internet each time it is requested.

What kind of caching is available to ASP.NET applications?
The Microsoft .NET Framework includes a caching object model that provides two types of caching: data caching and output caching. Both allow you to control the life of the item and other configuration parameters that influence the caching behavior. Output-cached items take the form of entire pages or just fragments of a page. Output caching improves performance because pages or page fragments do not have to be constructed anew for each client request.

Items stored in the data cache consist of application objects that are stored in memory so that the application saves time in retrieval of the data from the data source. Each ASP.NET-based application has a private memory cache stored for the lifetime of the application. The cache is cleared each time you restart the application.

Caching an ASP.NET page
The idea behind caching the contents of an entire page is so that the page can be served up all at once from the cache rather than having to regenerate the page for each request that is made. When a page request is made, the cache engine checks its contents to determine if a cached version of the page already exists. If it does, the cached HTML page is sent in response to the client request. If it doesn't, the page is dynamically rendered, returned to the client, and stored in the cache if necessary.

Caching the contents of an entire page is simple to do. The @OutputCache page directive instructs the cache engine on what to do with the page. The directive includes a Duration parameter that indicates how long the page should be stored in the cache. In addition, there is a VaryByParam parameter that gives you the flexibility to store multiple versions of the page based on the query string or Web form contents posted to the page. This is extremely handy when there are a limited number of versions of a page that render differently based upon some value within the query string or posted on the Web form.

When multiple parameters affect the page output, the VaryByParam can be set to * to allow it to vary the caching across all parameter combinations. This feature isn't ideal for every page, however. The page should have fairly static content before you decide to cache the entire contents of the page; otherwise, you could consume a large amount of system resources for little benefit.
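For example, suppose a page's output differs only by a hypothetical category value in the query string. The directive at the top of the .aspx page might look like this (Duration is in seconds):

    <%@ OutputCache Duration="60" VaryByParam="category" %>

With this in place, a separate rendered copy of the page is cached for each distinct category value and served for 60 seconds before the page is regenerated. Setting VaryByParam="None" caches a single copy regardless of parameters.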

Caching an ASP.NET page fragment
Caching an entire page isn't always feasible, especially when part of the page content is highly volatile. More often than not, a page may have some content that is static, such as a header and footer, while the body of the page is dynamic, depending upon the user accessing the page or some other variable. In an ideal scenario, the static content would be cached while the rest of the page content would render dynamically when requested. This is precisely what page fragment caching allows you to accomplish in your Web applications.

Fragment caching is implemented through User Controls. The fragments of the page to be cached are placed in one or more User Controls, and the User Controls are then included in different pages as desired. The caching options for the controls are specified by placing an @OutputCache directive within each control or programmatically in the code behind using the PartialCachingAttribute. The caching parameters are set on the individual controls rather than within the page itself. The User Controls are then cached independently according to each directive, while the remainder of the page content remains dynamic and uncached.
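As a sketch, a hypothetical Header.ascx control could declare its own caching in the control markup, or the equivalent setting could be applied in the code behind with the attribute; the control and class names here are illustrative only:

    <%-- Header.ascx: cache this fragment for 10 minutes --%>
    <%@ Control Language="C#" %>
    <%@ OutputCache Duration="600" VaryByParam="None" %>

    // Or, in the code behind for the control:
    using System.Web.UI;

    [PartialCaching(600)]
    public class Header : UserControl
    {
        // header rendering logic
    }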

Caching data
While there are parameter inputs for page and page fragment caching, for the most part, the cache engine has the majority of the control and does most of the work. On the other hand, data caching is where the majority of the control is handled programmatically. The Cache class located in the System.Web.Caching namespace is the way in which items are added into and retrieved from the ASP.NET cache. It is relatively straightforward to use in your Web applications. The Cache class provides a dictionary type interface, which means you assign a key to the value when it is stored. The key is used to access the value again at a future point in time when it is needed.

You have the following three options for programmatically putting content into the cache:
  1. Using a name value pair similar to a Dictionary object.
    Example: Cache["keyName"] = "MyValue";
    Example: Cache["keyName2"] = myObject;
  2. Using the Cache.Add method, which returns an object and fails to add the new item if one with the given key name already exists in the cache (the existing entry is left untouched). Add has a single overload, so the dependency, expiration, priority, and removal-callback arguments must all be supplied; null and the Cache.NoAbsoluteExpiration/Cache.NoSlidingExpiration constants act as "not used" placeholders.
    Example: Cache.Add("keyName", "MyValue", null, Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
    Example: Cache.Add("keyName2", myObject, null, Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
  3. Using the Cache.Insert method, which, unlike the Add method, does not return a value. In addition, if an item already exists in the cache with the given key name, the Insert method overwrites it with the new value.
    Example: Cache.Insert("keyName", "MyValue");
    Example: Cache.Insert("keyName2", myObject);
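Retrieval uses the same key. Because a cached item can be evicted at any time, the returned value should be checked for null and cast to the expected type before use. A minimal sketch in page code behind, where GetCustomers() stands in for a hypothetical data-access helper:

    // System.Data and System.Web.Caching assumed
    DataSet customers = (DataSet)Cache["Customers"];
    if (customers == null)
    {
        // Not cached yet, or already evicted; load and cache it again
        customers = GetCustomers();   // hypothetical data-access helper
        Cache["Customers"] = customers;
    }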

The Add and Insert methods offer advantages over the straight dictionary interface. These methods allow other options to be set, such as an expiration policy, a priority policy, dependencies, and a delegate that is called when the item is removed from the cache. These options aren't available through the dictionary interface.

The expiration policy is controlled through the DateTime and TimeSpan data types. A DateTime value specifies an absolute date and time at which the item is removed from the cache, while a TimeSpan value specifies a sliding interval from when the item was last accessed until it expires from the cache.
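A sketch of both styles from page code behind, using made-up key names and values (absolute and sliding expiration are mutually exclusive for a single item, so the unused one is passed as its No* constant):

    // Absolute expiration: remove the item at a fixed point in time
    string rates = "1 USD = 0.92 EUR";          // hypothetical value to cache
    Cache.Insert("DailyRates", rates, null,
        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);

    // Sliding expiration: keep the item as long as it is accessed
    // at least once every 20 minutes
    string results = "cached search results";   // hypothetical value to cache
    Cache.Insert("SearchResults", results, null,
        Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));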

The priority policy is related to garbage collection. Because the cache is held in memory, space is limited; when the items in the cache exceed the available space, some items are automatically purged. The priority policy, specified through the CacheItemPriority enumerated type, gives you control over the order in which items are purged relative to the other items in the cache. This way, you control which items are of higher importance and will remain cached longer than others.
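The priority is supplied through the full Insert (or Add) overload. In this sketch, an inexpensive-to-rebuild item is marked Low so it is evicted first under memory pressure, while critical reference data is marked NotRemovable; the key names and values are illustrative only:

    // Cheap to rebuild, so let it be purged first
    string[] headlines = { "Headline 1", "Headline 2" };   // hypothetical data
    Cache.Insert("RecentHeadlines", headlines, null,
        Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
        CacheItemPriority.Low, null);

    // Critical reference data that should stay in the cache
    string[] stateCodes = { "IN", "OH", "MI" };            // hypothetical data
    Cache.Insert("StateCodes", stateCodes, null,
        Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
        CacheItemPriority.NotRemovable, null);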

A validation dependency is used to link the cached item to a file, multiple files, or a directory. When the dependency is changed, the cached item is expired and removed from the cache. The idea is that you can link your cached item to some XML or other content contained in a file. This way, you can pull data from an external source and invalidate the current cache whenever newer data is obtained.
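A minimal sketch from page code behind, assuming product data lives in an XML file at a hypothetical path; editing the file automatically expires the cached DataSet:

    // System.Data and System.Web.Caching assumed
    string path = Server.MapPath("Data/Products.xml");   // hypothetical file
    DataSet products = new DataSet();
    products.ReadXml(path);

    // Tie the cached item to the file; a change to the file removes the entry
    Cache.Insert("Products", products, new CacheDependency(path));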

A delegate is called when the item is removed from the cache. This allows you to have notification when the object is no longer in cache so that the appropriate action can be taken, such as putting the item back into cache. This is ideal for use with a validation dependency. When the dependency changes, causing the item to be removed, the delegate can put the item back into the cache based upon the changes.
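The notification comes through a CacheItemRemovedCallback delegate. Below is a sketch of the reload-on-removal pattern described above, written as a small static helper; the productFile path and LoadProducts method are hypothetical:

    using System.Data;
    using System.Web;
    using System.Web.Caching;

    // Static helper so the callback is not tied to a particular page instance
    public class ProductCache
    {
        // Hypothetical path to the data file the cached item depends on
        private static string productFile = @"C:\data\Products.xml";

        public static void CacheProducts()
        {
            HttpRuntime.Cache.Insert("Products", LoadProducts(),
                new CacheDependency(productFile),
                Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
                CacheItemPriority.Normal,
                new CacheItemRemovedCallback(OnProductsRemoved));
        }

        // Called by ASP.NET whenever the item leaves the cache
        private static void OnProductsRemoved(string key, object value,
            CacheItemRemovedReason reason)
        {
            // Only re-cache when the underlying file changed
            if (reason == CacheItemRemovedReason.DependencyChanged)
            {
                CacheProducts();
            }
        }

        private static DataSet LoadProducts()
        {
            DataSet products = new DataSet();
            products.ReadXml(productFile);
            return products;
        }
    }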

What should I cache?
Unfortunately, there is no magic answer when it comes to what should be cached. It requires intimate knowledge of the application and how it is going to be utilized. Caching for your application should be considered when:
  • Data is frequently accessed from an external data source such as a database. This data could consist of information such as user IDs, passwords, code dictionary, or other commonly accessed tables. By storing this data in the cache, it will not have to be fetched from the database for each use.
  • Your site is a portal similar to the ASP.NET Portal example provided in one of the ASP.NET Starter Kits from Microsoft; in that case, you could cache the page structure information used to render the correct controls when a request is made.
  • Fairly static page contents, such as a header, footer, menus, or company contact information, should be cached as either whole pages or as User Controls.
  • Frequently used forms, such as search forms, should be cached as User Controls so that they can be effectively reused in multiple pages.

How do I evaluate if I’ve made the correct caching choices?
An important part of evaluating caching is gathering performance feedback to justify the cache selections and decide what adjustments need to be made. ASP.NET exposes a number of application performance counters (viewable through Start | Programs | Administrative Tools | Performance) that provide the appropriate feedback. The following is a list of the cache-related performance counters in version 1.1 of the Microsoft .NET Framework:
  • Cache Total Entries shows the total number of entries.
  • Cache Total Hits shows the number of times something was retrieved from the cache.
  • Cache Total Misses shows the number of times something was attempted to be retrieved from the cache but did not exist.
  • Cache Total Hit Ratio shows a ratio of the number of items found in the cache against the number of accesses.
  • Cache Total Turnover Rate is the number of items added to or removed from the cache per second.

The ideal scenario consists of a high number of hits, a low number of misses, a high total hit ratio, and a low turnover rate. If the number of misses is too high, you may not be caching enough items. Since the turnover rate is measured per second, it is likely to spike as items are added to or removed from the cache. If the turnover rate remains steady and fairly high, you are either adding too many items to the cache or items are being evicted too quickly to be useful.
