Code optimization: Memory management in .NET, part 2

Memory management is an important aspect of application design. In the .NET Framework, memory is reclaimed automatically by the garbage collector, while unmanaged resources are cleaned up through the Dispose design pattern. This overview explains the interfaces .NET exposes to the developer for managing memory.


In the .NET world, memory management is automatic. Memory that is no longer in use is reclaimed as a background task, letting the developer focus on the task at hand rather than the plumbing work. The first part of this series on garbage collection dealt with how garbage collection in .NET works. In this article, I'll look at the interfaces exposed to the developer to aid and control garbage collection.

The Dispose design pattern: IDisposable, Dispose, and Finalize
The Common Language Runtime (CLR) cannot clean up resources such as database connections, window handles, and file handles on its own. It is therefore up to the developer to provide a mechanism for cleaning up these unmanaged resources. This clean up can be implemented in the Finalize method, which in C# is written as a destructor. When Finalize actually runs, however, is still under the control of the garbage collector.

Usually, you need a deterministic way to clean up unmanaged resources such as file handles. For example, if you have opened a file for reading and have finished loading its contents into a buffer, you will want to close the file handle immediately rather than wait for the garbage collector. For this explicit clean up, .NET provides the Dispose design pattern.

Objects that need explicit clean up of unmanaged resources implement the IDisposable interface. IDisposable consists of a single method, Dispose, which, unlike the Finalize method, is called under the control of the developer.

Since a call to Dispose represents an explicit clean up, the garbage collector no longer needs to finalize these objects. Hence a Dispose method should contain a call to GC.SuppressFinalize(), notifying the garbage collector that finalization is not needed for that object.

The recommended practice is to implement both a Finalize method and a Dispose method on an object that needs to clean up unmanaged resources. Finalize serves as a backup mechanism in the event that Dispose is never called: the garbage collector performs the object's finalization and prevents a permanent leak of the unmanaged resource.

Figure A: Dispose design pattern

The code snippet in Listing A demonstrates these concepts more clearly.

In Listing A, the class SampleClass uses a file handle, which is an unmanaged resource. It is therefore necessary for this object to implement IDisposable and also to provide a Finalize method.

The clean up code, which releases the file handle, is part of the Dispose method. GC.SuppressFinalize() is called once the unmanaged resource has been cleaned up.

The class also provides a destructor (the Finalize method), which again contains the clean up code for the unmanaged resource, the file handle.
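
Listing A itself is not reproduced here, so the following is only a minimal sketch of what a class like SampleClass might look like. The use of a FileStream as a stand-in for the file handle, the constructor, and the Dispose(bool) helper are illustrative assumptions rather than the original listing.

using System;
using System.IO;

// Hypothetical sketch of the Dispose design pattern described above.
public class SampleClass : IDisposable
{
    private FileStream fileHandle;   // stand-in for the unmanaged resource
    private bool disposed = false;

    public SampleClass(string path)
    {
        fileHandle = new FileStream(path, FileMode.Open, FileAccess.Read);
    }

    // Explicit, deterministic clean up under the developer's control.
    public void Dispose()
    {
        Dispose(true);
        // Finalization is no longer needed once clean up has run.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (!disposed)
        {
            if (disposing)
            {
                // Release the file handle held by this object.
                if (fileHandle != null)
                {
                    fileHandle.Close();
                    fileHandle = null;
                }
            }
            disposed = true;
        }
    }

    // Destructor (Finalize): backup clean up if Dispose is never called.
    ~SampleClass()
    {
        Dispose(false);
    }
}

A caller can then invoke Dispose explicitly, or wrap the object in a using statement, which calls Dispose automatically when the block exits:

using (SampleClass sample = new SampleClass("data.txt"))
{
    // work with the file contents here
}   // Dispose runs here, even if an exception is thrown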

Weak references
The .NET Framework provides another interesting feature that is useful for implementing caches: the weak reference, exposed as the System.WeakReference class. A weak reference lets you refer to an object while still leaving it available for garbage collection. The ASP.NET cache uses weak references; if memory usage becomes too high, the cached objects can be reclaimed.
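
As a rough, hypothetical illustration of the idea, the snippet below caches an object through System.WeakReference; the byte array simply stands in for an expensive-to-create object.

using System;

class WeakReferenceDemo
{
    static void Main()
    {
        // Hold the object only through a weak reference so the garbage
        // collector remains free to reclaim it under memory pressure.
        byte[] data = new byte[1024 * 1024];
        WeakReference weakRef = new WeakReference(data);

        data = null;    // drop the strong reference

        // Later: try to get the object back. Target is null if it was collected.
        byte[] cached = (byte[])weakRef.Target;
        if (cached != null)
        {
            Console.WriteLine("Cache hit: the object is still alive.");
        }
        else
        {
            Console.WriteLine("Cache miss: the object was collected; recreate it.");
        }
    }
}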

Forcing garbage collection
The .NET Framework exposes the System.GC class so the developer can control some aspects of the garbage collector. A collection can be forced by calling the GC.Collect method. The general advice is not to invoke the garbage collector manually and to leave the timing of collections to the runtime. In certain situations, the developer can justify that forcing a collection provides a performance boost. However, great care should be taken, because whenever the garbage collector runs it suspends all currently executing threads. GC.Collect should never be placed where it will be called frequently; doing so would degrade the performance of the application.
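
If a forced collection really is justified, a typical call, shown here only as a sketch, pairs GC.Collect with GC.WaitForPendingFinalizers so that queued finalizers also complete before the application continues:

// Use sparingly: the collection suspends executing threads while it runs.
GC.Collect();                    // force a full garbage collection
GC.WaitForPendingFinalizers();   // let pending finalizers finish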

Server build and workstation build in .NET
The .NET Framework ships two builds of the same CLR, each tuned for a specific purpose. These are the server runtime and the workstation runtime, implemented in mscorsvr.dll and mscorwks.dll, respectively. The server build is designed to take advantage of multiple processors so that garbage collection can run in parallel. On a single-processor machine, only the workstation build is loaded; it is not possible to load the server build there.

There is also a further choice between concurrent and nonconcurrent garbage collection. The nonconcurrent setting suits server environments, where the application does not need to stay responsive to a user. In a client environment, where a user interface is present and the application must remain responsive, the concurrent setting is used.

Figure B: Processor environments

Microsoft has set sensible defaults in the project templates available in Visual Studio .NET: an ASP.NET application can take advantage of multiple processors and load the server build of the CLR, while a Windows application, which is usually user-interface rich, loads the workstation build.

It is possible to override these defaults and control which build of the CLR gets loaded on a multiprocessor machine by using the CorBindToRuntimeEx hosting API.

Productivity
With the Dispose design pattern, developers have a deterministic way to clean up unmanaged resources, and the choice between the server and workstation builds lets the garbage collector be matched to the environment. The .NET Framework provides the infrastructure for garbage collection, freeing the developer from keeping track of memory clean up. The developer only needs to track the unmanaged resources the application uses, which makes life easier and improves productivity.
