As we’ve seen in previous articles, the .NET Framework includes easy-to-use support for creating multithreaded applications. But developers new to free-threaded programming techniques should exercise caution, as the potential for hard-to-find errors looms large for the unwary. These errors, called synchronization errors, are caused when multiple threads attempt to access the same shared resource at the same time.
Synchronize your watches
Although a sound design can help, it’s difficult to exploit the benefits of a free-threaded design while avoiding synchronization problems completely. What you need, then, is some mechanism for preventing more than one thread at a time from entering the sections of your code that work with shared resources. Luckily, .NET provides several synchronization tools you can use to protect these critical sections of your code. In this article, we’ll look at some of these tools (there are more in the System.Threading namespace), and I’ll provide you with some source code that demonstrates the use of each.
Inadvertent interlocked incrementing in Illinois
Even a simple increment or decrement operation actually compiles down to three or four machine-level instructions, so either operation can be interrupted partway through and produce unpredictable results. The System.Threading.Interlocked class provides two static methods, Increment and Decrement, which combine the steps involved in changing and evaluating a variable into a single atomic operation. That operation can’t be left half-finished if a context switch occurs. The class also exposes Exchange and CompareExchange methods, which set a variable’s value unconditionally or based on the result of a comparison, respectively. Check out Listing A for an example.
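If Listing A isn’t handy, here’s a minimal sketch of the idea (the counter, thread count, and names are my own, purely illustrative): several threads bump a shared counter, and Interlocked.Increment keeps the total accurate where a plain counter++ would not.

using System;
using System.Threading;

class InterlockedSketch
{
    // Shared counter touched by several threads at once.
    private static int counter = 0;

    static void Main()
    {
        Thread[] workers = new Thread[4];
        for (int i = 0; i < workers.Length; i++)
        {
            workers[i] = new Thread(new ThreadStart(Work));
            workers[i].Start();
        }
        foreach (Thread t in workers)
            t.Join();

        // With Interlocked.Increment the total is always 400000;
        // with counter++ it would frequently come up short.
        Console.WriteLine("Final count: {0}", counter);
    }

    static void Work()
    {
        for (int i = 0; i < 100000; i++)
            Interlocked.Increment(ref counter);
    }
}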
Context attributes
By using .NET’s Synchronization attribute (found in the System.Runtime.Remoting.Contexts namespace), objects that inherit from ContextBoundObject can be synchronized so that only one thread at a time can access their members. Obviously, if only one thread can execute code in an object, the risk of a task switch inside a critical section of code is eliminated. This is a simple solution to a potentially complex problem. The downside, of course, is that the attributed class essentially becomes single-threaded, which is rather like killing the patient to cure the disease. As such, consider it only when your sole intent is to prevent free-threaded use of the protected object.
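As a rough sketch (the class and member names here are mine, not from the article’s listings), applying the attribute looks like this; note that the class must derive from ContextBoundObject for the runtime to enforce the synchronization.

using System;
using System.Runtime.Remoting.Contexts;

// Every call into this object is serialized by its synchronization
// context: only one thread at a time can execute any member.
[Synchronization]
public class SafeCounter : ContextBoundObject
{
    private int count;

    public void Add(int amount)
    {
        // No explicit lock needed; the context guarantees exclusive access
        // for the duration of the call.
        count += amount;
    }

    public int Count
    {
        get { return count; }
    }
}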
Hall monitor
.NET’s Monitor object is a big step up in sophistication from marking an entire class with the Synchronization attribute. Monitor lets you mark the code between calls to the static Monitor.Enter and Monitor.Exit methods as a critical section. Monitor.Enter accepts a reference-type argument whose lock is acquired by the calling thread. The object you pass to Monitor.Enter can be whatever you like; it doesn’t have to be a reference to the class that contains the Monitor calls. When a thread reaches Monitor.Enter while another thread holds the lock on that object, meaning the protected block is executing on the other thread, it is blocked until the other thread calls Monitor.Exit and releases the lock.
Monitor offers the advantages of being simple, flexible enough to operate on blocks of code rather than entire classes or methods, and fast compared to the alternatives. The SyncLock and lock keywords, in VB.NET and C# respectively, both implement Monitors for you behind the scenes. In Listing B, you’ll find a simple example illustrating the use of Monitor.
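In the same spirit as Listing B (though not reproduced from it; the class and field names are mine), a critical section protected by Monitor looks like this, with the lock statement shown as the equivalent shorthand.

using System;
using System.Threading;

class Account
{
    // Any reference type will do as the lock token.
    private readonly object sync = new object();
    private decimal balance;

    public void Deposit(decimal amount)
    {
        Monitor.Enter(sync);
        try
        {
            // Critical section: only one thread at a time gets past Enter.
            balance += amount;
        }
        finally
        {
            // Always release the lock, even if an exception is thrown.
            Monitor.Exit(sync);
        }
    }

    public void Withdraw(decimal amount)
    {
        // lock (SyncLock in VB.NET) expands to the Enter/try/finally/Exit
        // pattern shown above.
        lock (sync)
        {
            balance -= amount;
        }
    }
}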
Semaphore flags
The Win32 API users among you will notice similarities between the Monitor object and Win32’s synchronization primitives. Strictly speaking, Monitor is closest to a Win32 critical section: it grants exclusive ownership to one thread at a time, whereas a Win32 semaphore can admit a configurable number of threads at once.
A Mutex? Is that some kind of bird?
While Monitor and its siblings are pretty flexible, they lack two important features. First, there’s no easy way to control a Monitor from outside the thread that’s currently executing the protected code. Second, if one thread is blocked at the entrance to a Monitor block whose lock is held by a second thread, and that second thread is in turn blocked waiting on a lock held by the first, both threads will wait indefinitely, each waiting for the other to release it. This situation is known as deadlock.
A Mutex (short for mutually exclusive), on the other hand, is flexible enough to handle both of these situations. Mutexes work on the concept of signaled and unsignaled states: a Mutex is signaled when no thread owns it and unsignaled when some thread does. I know, that seems backwards somehow, but it made sense to the anonymous engineer who came up with it. To take ownership of a Mutex, and thereby unsignal it, a thread calls the WaitOne method on a Mutex instance. Any other thread that then calls WaitOne on the same instance will block until the first thread signals the Mutex again by calling ReleaseMutex.
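As a bare-bones sketch (the names are mine), acquiring and releasing a Mutex looks like this:

using System;
using System.Threading;

class MutexSketch
{
    // One Mutex instance shared by every thread that needs the resource.
    private static Mutex gate = new Mutex();

    static void UseSharedResource()
    {
        // WaitOne blocks until the Mutex is signaled, then takes
        // ownership of it (putting it in the unsignaled state).
        gate.WaitOne();
        try
        {
            Console.WriteLine("Only one thread in here at a time.");
        }
        finally
        {
            // Signal the Mutex so the next waiting thread can proceed.
            gate.ReleaseMutex();
        }
    }
}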
Same old Mutex
For the Win32 API-initiated, this is the same Mutex you may have used in the past with WaitForSingleObject and its relatives; the WaitOne method is essentially an object-oriented wrapper for WaitForSingleObject.
Doesn’t using a Mutex in this fashion still leave you open to the potential for deadlock? Yes, it does, and I wasn’t exaggerating when I said Mutexes give you a way to handle it. The WaitOne method accepts a timeout interval (as either a System.TimeSpan or an integer number of milliseconds). If that interval elapses before the Mutex is signaled, WaitOne returns false, so your code can tell whether it acquired the Mutex or timed out by checking the return value. You can see this in action in Listing C, which also illustrates how a Mutex can be shared across threads.
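Listing C has the full example; the fragment below is just my own minimal illustration (names and the two-second timeout are illustrative) of the timeout overload and the return-value check.

using System;
using System.Threading;

class TimeoutSketch
{
    private static Mutex gate = new Mutex();

    static void TryWork()
    {
        // Wait up to two seconds for the Mutex; false means we timed out.
        if (gate.WaitOne(TimeSpan.FromSeconds(2), false))
        {
            try
            {
                Console.WriteLine("Got the Mutex; doing the protected work.");
            }
            finally
            {
                gate.ReleaseMutex();
            }
        }
        else
        {
            // We never owned the Mutex, so don't release it; just back off.
            Console.WriteLine("Timed out; will try again later.");
        }
    }
}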
What about performance?
Is there a performance cost associated with using these synchronization methods? Yes, there is always a cost, and how much varies with the method used. Interlocked operations are usually the cheapest, since they boil down to single atomic instructions. Monitor, along with the lock and SyncLock keywords that wrap it, falls somewhere in the middle. Mutexes, which are kernel objects, generally exact the heaviest penalty: that’s the price you pay for the flexibility they offer. The attribute-based approach adds its own overhead as well, because every call into a context-bound object is routed through its synchronization context.
If you’ve followed this series from its beginning, you should be somewhat comfortable with .NET’s free-threading model. Armed with the synchronization tools I’ve introduced you to in this final installment, you should no longer experience abject fear when confronted with a free-threaded application. A little healthy respect is all that’s called for.
A stitch in time
What tips or advice do you have for someone wanting to develop a multithreaded application? Send us an e-mail with your suggestions and experiences.