The .NET Framework provides the necessary infrastructure to write multithreaded applications. A thread represents a part of a program that can run independently; it is also the basic unit to which an operating system allocates processor time.

Using multiple threads in this way can significantly improve the performance of an application. An efficient way to manage multiple threads in the .NET Framework is to use the ThreadPool class.
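Before turning to the ThreadPool, it helps to see the raw mechanics. The sketch below (the class and method names are ours, not from the article's listings) shows a thread being created, started, and joined:

```csharp
using System;
using System.Threading;

class ThreadBasics
{
    static void Main()
    {
        // Create a thread that will run the Work method
        // independently of the main thread.
        Thread worker = new Thread(new ThreadStart(Work));
        worker.Start();

        Console.WriteLine("Main thread continues while the worker runs.");

        // Block the main thread until the worker finishes.
        worker.Join();
        Console.WriteLine("Worker has completed.");
    }

    static void Work()
    {
        Console.WriteLine("Worker thread: running independently.");
    }
}
```

Note that Join is what ties the two threads back together; without it, Main could exit while the worker is still running.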

States of a thread: Basics of threading
Before we proceed any further, it is important to understand the operations that can be performed on a thread and the states that result from those operations.

The states of a thread in the .NET Framework are defined by the ThreadState enumeration. The possible states include Unstarted, Running, WaitSleepJoin, SuspendRequested, Suspended, AbortRequested, and Stopped.
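A few of these transitions can be observed directly from code. In this sketch (our own example, not from the article's listings), the thread reports Unstarted before Start is called, typically WaitSleepJoin while it sleeps, and Stopped after it completes:

```csharp
using System;
using System.Threading;

class StateDemo
{
    static void Sleeper()
    {
        Thread.Sleep(200);
    }

    static void Main()
    {
        Thread t = new Thread(new ThreadStart(Sleeper));

        Console.WriteLine(t.ThreadState);  // Unstarted

        t.Start();
        Thread.Sleep(100);
        // While sleeping, the thread is usually in WaitSleepJoin,
        // though there is a brief window where it may still
        // report Running.
        Console.WriteLine(t.ThreadState);

        t.Join();
        Console.WriteLine(t.ThreadState);  // Stopped
    }
}
```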

Figure A
Thread States

Figure A shows the states of a thread and the operations that cause a thread to transition to each state.

The ThreadPool class
The ThreadPool class provides the easiest way to handle multiple operations. The class exposes the QueueUserWorkItem method, which queues a task for execution; the WaitCallback delegate is used to pass the method that will be called when a pool thread becomes available. All the threads in a ThreadPool run at the default priority. The threads in a ThreadPool also run as background threads; therefore, if the main thread exits, they will not be kept alive.

The ThreadPool is used within the .NET Framework itself, for example in asynchronous I/O. Even .NET Remoting internally makes use of a ThreadPool to provide threads to service Remoting calls in SingleCall mode.

Let’s take a look at the sample code in Listing A to find out how we can utilize a ThreadPool. The Main method demonstrates the way the ThreadPool class is used. The operations that we are going to perform are created as methods of the class Sample2, named PerformOperation1 and PerformOperation2.

Listing A demonstrates calling a method and also passing data (parameters) to the queued worker thread. Let’s look at Listing A and the steps involved in using the ThreadPool.

  • STEP1: The static method QueueUserWorkItem on the ThreadPool class is called, and the operations to be performed, PerformOperation1 and PerformOperation2, are passed as parameters via the WaitCallback delegate.
  • STEP2: If data must be passed to the worker thread, it is encapsulated in an object, shown in Listing A as the DataHolder class. This object is passed as the second parameter to the two-parameter overload of QueueUserWorkItem.
  • STEP3: The main thread loops, waiting for the operations to complete. This is necessary because the worker threads in the pool run as background threads and would not remain alive once the main thread exits.
  • STEP4: The queued operations are processed. For PerformOperation2, the data is unpackaged by casting the state parameter back to a DataHolder object.
  • STEP5: In each operation, the counter is incremented using the Interlocked class so that the increment is performed as a single atomic operation, avoiding threading issues.
  • STEP6: Once the operations are complete, the Main thread can exit.

Advantages of a ThreadPool
The .NET ThreadPool provides a convenient way to manage multiple threads. The task of thread management is taken care of by the infrastructure, which allows the developer to focus on the business logic rather than the technical nuances.

Also, since the infrastructure is managing the thread pool, it can optimize the scheduling of threads by taking into account not only the process in which the application is running but also all processes running on the system. A ThreadPool ensures that the CPU utilization is high by launching queued worker threads when the CPU is free.

Deadlocks and race conditions
It is important to have a good understanding of the consequences that may arise from using threads in applications. Deadlocks and race conditions occur when the possible scenarios and code paths are not well understood.

A deadlock is a situation in which multiple threads wait on each other indefinitely. This can happen when two threads each hold a resource the other needs, and each waits for the other to release it.

Figure B

Figure B shows a deadlock situation between Thread1 and Thread2. Thread1 has a lock on Resource1 and is waiting for Thread2 to release Resource2. Thread2 has a lock on Resource2 and is waiting for Thread1 to release Resource1.

Deadlocks can be prevented at the design level by diagramming the possible scenarios. In code, appropriate synchronization disciplines need to be adopted to prevent deadlocks. (A complete discussion of the various synchronization mechanisms available in .NET is beyond the scope of this article.)
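One common discipline is consistent lock ordering. In this sketch (our own example), both threads acquire Resource1 before Resource2, so the circular wait shown in Figure B cannot occur:

```csharp
using System;
using System.Threading;

class LockOrdering
{
    static readonly object resource1 = new object();
    static readonly object resource2 = new object();

    static void Main()
    {
        Thread t1 = new Thread(new ThreadStart(Worker));
        Thread t2 = new Thread(new ThreadStart(Worker));
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();
        Console.WriteLine("No deadlock: both threads completed.");
    }

    static void Worker()
    {
        // Every thread locks resource1 first, then resource2.
        // A deadlock like Figure B requires one thread to take the
        // locks in the opposite order, which this rules out.
        lock (resource1)
        {
            lock (resource2)
            {
                Console.WriteLine("Holding both resources.");
            }
        }
    }
}
```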

A race condition makes the outcome of executing a piece of code unpredictable: the result depends on which thread reaches a section of code first. Because of this, the results may vary each time the code is run. The .NET Framework provides various techniques for overcoming race conditions. (A complete discussion of these techniques is beyond the scope of this article.)
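A classic race is an unsynchronized increment: the ++ operator is a read-modify-write sequence, so two threads can read the same value and lose an update. In this sketch (our own example), the unprotected counter may come up short while the Interlocked counter is always exact:

```csharp
using System;
using System.Threading;

class RaceDemo
{
    static int unsafeCounter = 0;
    static int safeCounter = 0;

    static void Main()
    {
        Thread t1 = new Thread(new ThreadStart(Work));
        Thread t2 = new Thread(new ThreadStart(Work));
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();

        // unsafeCounter can be anything up to 200000 because ++
        // is not atomic; safeCounter is always exactly 200000.
        Console.WriteLine("unsafe: " + unsafeCounter);
        Console.WriteLine("safe: " + safeCounter);
    }

    static void Work()
    {
        for (int i = 0; i < 100000; i++)
        {
            unsafeCounter++;                         // racy
            Interlocked.Increment(ref safeCounter);  // atomic
        }
    }
}
```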

Increasing the number of threads
The ThreadPool class provided by the .NET Framework limits the number of threads that can be spawned per processor; the default is 25. For scenarios where this number is not enough, the limit can be raised by calling the CorSetMaxThreads function, which is defined in mscoree.h.

This facility would come in handy in scenarios where the application may be failing to scale up because of this limitation. In such scenarios, increasing the ThreadPool size will allow the system to spawn more than 25 threads per processor.

The first step in achieving this is to declare a CorRuntimeHost class and apply the ComImportAttribute to it, specifying the Guid of the COM class. The ThreadPool interface is likewise imported using its Guid and marked as an IUnknown-based interface. The signatures of the various methods that make up the CLR ThreadPool interface are then added to the interface definition.

This functionality is now wrapped in the class, CLRInternalThreadPool, shown in Listing B. This class instantiates the CorRuntimeHost and obtains the reference to the CLR ThreadPool object. The Main method demonstrates how the number of worker threads can be increased.
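Later versions of the .NET Framework also expose this setting through the managed ThreadPool.SetMaxThreads method, which avoids the COM interop of Listing B entirely. A minimal sketch (the doubling of the worker count is our arbitrary choice):

```csharp
using System;
using System.Threading;

class PoolSize
{
    static void Main()
    {
        int workers, ioThreads;
        // Read the current ceiling on worker and I/O completion threads.
        ThreadPool.GetMaxThreads(out workers, out ioThreads);
        Console.WriteLine("Current max: " + workers + " worker, "
                          + ioThreads + " I/O completion threads");

        // Raise the ceiling; SetMaxThreads returns false if the
        // requested values are rejected (e.g., below the minimums).
        bool changed = ThreadPool.SetMaxThreads(workers * 2, ioThreads);
        Console.WriteLine("Change accepted: " + changed);
    }
}
```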

The developer must keep in mind that this approach has drawbacks. There is an overhead associated with context switching between threads; if the number of threads is set to an excessively high value, performance will actually deteriorate, because the system spends more time switching between threads than performing the scheduled work.

Through the ThreadPool, the .NET Framework unleashes the power of multithreading to programmers to achieve higher levels of performance in their applications. The code included in this article should give you a basic introduction to the power of the ThreadPool class.