Multithreading in .NET Applications, Part 2

Mark Strawmyer Presents: .NET Nuts & Bolts


Multithreading is a powerful design tool for creating high-performance applications, especially those that require user interaction. Microsoft .NET has broken down the barriers that once existed in creating multithreaded applications. The last installment of the .NET Nuts & Bolts column was Part 1 of our exploration of multithreading with the .NET Framework. In that article, we covered the background of threading and its benefits, and provided a demonstration. This article serves as Part 2 of our exploration. We will look at the basic methods used to work with threads, along with the synchronization of thread activity.

Working with Threads

To work successfully with threads, we must understand some basic thread operations. The commonly used methods of the System.Threading.Thread class are listed below:

  • Abort raises a ThreadAbortException in the thread to tell it to terminate. A thread cannot be restarted once it has been aborted.
  • Join blocks the calling thread until this thread terminates.
  • Sleep blocks the current thread for a specified number of milliseconds.
  • Start causes the thread to begin execution.

It is important to note that the list does not include methods to stop or free a thread. The .NET common language runtime (CLR) handles this automatically when the thread finishes executing.

Sample Code Listing

The following code contains a sample console-based application. The application creates a new thread that calls a method to display continuous messages to the console window until the thread is complete. The thread could be terminated through a call to the Abort() method, but for this example we'll just loop a certain number of times and stop. The application contains example usage of most of the methods listed above.

using System;
using System.Collections;
using System.Threading;

namespace CodeGuru.MultithreadedPart2
{
  /// <remarks>
  /// Example console application demonstrating the basic methods
  /// used with threads.
  /// </remarks>
  class ThreadExample
  {
    /// <summary>
    /// The main entry point for the application.
    /// </summary>
    [STAThread]
    static void Main(string[] args)
    {
      // Create the thread and indicate to use the DisplayMessage
      // method
      ThreadExample example = new ThreadExample();
      Thread testThread = new Thread(new
                          ThreadStart(example.DisplayMessage));

      // Start the thread
      testThread.Start();

      // Wait until the thread finishes before continuing
      testThread.Join();
      Console.WriteLine("\r\nTest thread has finished");
    }

    /// <summary>
    /// Display a message to the console window.
    /// </summary>
    public void DisplayMessage()
    {
      for( int i = 0; i < 20; i++ )
      {
        Console.WriteLine(
          "DisplayMessage is running in its own thread.");
      }
    }
  }
}

Testing the Sample

Run the sample console application provided above. The message "DisplayMessage is running in its own thread." is printed 20 times, followed by "Test thread has finished".

Synchronizing Threads

Now that we understand some of the basic methods, we can look at the more complex issue of synchronization. When using multiple threads, it is possible that different threads could access the same object simultaneously and leave it in an invalid state. It is also possible for one thread to interrupt what another thread is doing with a resource. Thus, it is important to be able to control access to blocks of code and resources. This control is known as synchronization, and objects that use it correctly are known as thread safe.

Protecting a Code Region

When two or more threads need to access the same object, one thread may interrupt what another is doing, so it is imperative to control access to the affected blocks of code. This functionality is exposed through the static methods of the System.Threading.Monitor class. The Monitor class is used to synchronize blocks of code, instance methods, and static methods. The locking is performed based on a reference to an object, which means it will not lock properly on value types such as int (each use would box the value into a distinct object). While one thread holds the lock on an object, no other thread can enter a region protected by that same object. The region of code is established by a call to Monitor.Enter at the start and released by a call to Monitor.Exit; placing the Monitor.Exit call in a finally block ensures the lock is released even if an exception occurs.
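As a minimal sketch of the Enter/Exit pattern with try/finally, consider the following class. The class and member names here are illustrative, not part of the samples in this article:

```csharp
using System;
using System.Threading;

/// <remarks>
/// Illustrative sketch of protecting a code region with
/// Monitor.Enter and Monitor.Exit. The Account class and its
/// members are hypothetical names chosen for this example.
/// </remarks>
class Account
{
  // Private reference-type object used only for locking
  private object _balanceLock = new object();
  private decimal _balance = 0;

  public decimal Balance
  {
    get { return this._balance; }
  }

  public void Deposit(decimal amount)
  {
    // Only one thread at a time may enter the region guarded
    // by this lock object
    Monitor.Enter(this._balanceLock);
    try
    {
      this._balance += amount;
    }
    finally
    {
      // Releasing in a finally block guarantees the lock is
      // freed even if an exception is thrown
      Monitor.Exit(this._balanceLock);
    }
  }
}
```

Locking on a private object, rather than on the instance itself, keeps outside code from accidentally locking on the same reference.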

Sample Code Listing

The following code contains a sample console-based application. The application creates two separate threads: one adds numbers to a queue, and the other removes numbers from the same queue. The Monitor class is used so that the queuing and de-queuing threads do not try to add to and remove from the queue at the same time. The queuing thread is started first and adds an item to the queue. The de-queuing thread is started and waits for the queue to be released. When the queue is released, the de-queuing thread removes the item and then releases the queue in turn. Each thread continues using, releasing, and waiting for the queue until 10 items have been added and removed.

using System;
using System.Collections;
using System.Threading;

namespace CodeGuru.MultithreadedPart2
{
  /// <remarks>
  /// Example console application demonstrating the use of a
  /// monitor.
  /// </remarks>
  class MonitorExample
  {
    // Constant representing the maximum number of items to enqueue
    const int MAX_QUEUE_SIZE = 10;
    // Queue to hold items
    Queue _queue;

    /// <summary>
    /// Constructor
    /// </summary>
    public MonitorExample()
    {
      this._queue = new Queue(); 
    }

    /// <summary>
    /// The main entry point for the application.
    /// </summary>
    [STAThread]
    static void Main(string[] args)
    {
      // Start two threads to add and remove queue items
      MonitorExample test = new MonitorExample();
      Thread enqueueThread = new Thread(new
                                        ThreadStart(test.Enqueue));
      Thread dequeueThread = new Thread(new
                                        ThreadStart(test.Dequeue));
      enqueueThread.Start();
      dequeueThread.Start();

      // Wait for both threads to finish before exiting
      enqueueThread.Join();
      dequeueThread.Join();
    }

    /// <summary>
    /// Add items to the queue.
    /// </summary>
    public void Enqueue()
    {
      int counter = 0;

      Monitor.Enter(this._queue);
      while( counter < MAX_QUEUE_SIZE )
      {
        // Wait while the queue is busy
        Monitor.Wait(this._queue);

        // Add one item to the queue
        Console.WriteLine("Adding item {0} to queue",
                           counter.ToString());
        this._queue.Enqueue(counter++);

        // Release the queue
        Monitor.Pulse(this._queue);
      }
      Monitor.Exit(this._queue);
    }

    /// <summary>
    /// Remove items from the queue.
    /// </summary>
    public void Dequeue()
    {
      Monitor.Enter(this._queue);

      // Pulse to release the enqueue thread, which is already
      // waiting on the queue we just locked
      Monitor.Pulse(this._queue);

      // Wait for the queue to be pulsed; once the enqueue thread
      // has finished, the wait times out and the loop exits
      while(Monitor.Wait(this._queue, 500))
      {
        // Remove the first item and display it
        int counter = (int)this._queue.Dequeue();
        Console.WriteLine("Removed item {0} from queue",
                           counter.ToString());

        // Release the queue
        Monitor.Pulse(this._queue);
      }
      Monitor.Exit(this._queue);
    }
  }
}

Testing the Sample

Run the sample console application provided above. The output shows "Adding item n to queue" and "Removed item n from queue" messages alternating for items 0 through 9.

Protecting Operating System Resources

When two or more threads, within or across processes, need to access an operating system resource, access must be controlled to prevent conflicts; otherwise, one thread could interrupt another. System.Threading.WaitHandle provides the base class for controlling exclusive access to operating-system-specific resources. It is used for synchronizing resources between managed and unmanaged code and exposes operating-system-specific functionality, such as waiting on multiple resources. The classes derived from WaitHandle must implement a signaling mechanism to indicate taking or releasing exclusive access to a resource.

System.Threading.Mutex is a class derived from WaitHandle. Only one thread at a time can own a mutex. Prior to accessing the resource, each thread requests ownership of the mutex by calling one of the wait methods, such as WaitOne(). If the mutex is owned by another thread, the requesting thread blocks until ownership becomes available. When a thread is done with the resource, it signals completion through a call to the ReleaseMutex() method.
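Because a mutex can be given a name, it can also coordinate access across processes on the same machine. The following sketch illustrates the request/release pattern; the mutex name "CodeGuru.SharedResource" is an arbitrary value chosen for this example:

```csharp
using System;
using System.Threading;

/// <remarks>
/// Illustrative sketch of a named mutex. Any process on the
/// machine that creates a mutex with the same name shares it.
/// </remarks>
class NamedMutexSketch
{
  public static void UseSharedResource()
  {
    // false = do not take ownership immediately upon creation
    Mutex mutex = new Mutex(false, "CodeGuru.SharedResource");

    // Block until no other thread or process owns the mutex
    mutex.WaitOne();
    try
    {
      Console.WriteLine("Exclusive access acquired");
    }
    finally
    {
      // Signal completion so other waiters can continue
      mutex.ReleaseMutex();
    }
  }
}
```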

Sample Code Listing

The following code contains a sample console-based application. The application creates five different threads that all compete to use the same simulated system resource. Each thread is forced to wait until it can have exclusive access to the resource before it can continue.

using System;
using System.Threading;

namespace CodeGuru.MultithreadedPart2
{
  /// <remarks>
  /// Example console application demonstrating the use of a mutex.
  /// </remarks>
  class MutexExample
  {
    // Control access to the resource
    private static Mutex _mutex = new Mutex();

    /// <summary>
    /// The main entry point for the application.
    /// </summary>
    [STAThread]
    static void Main(string[] args)
    {
      // Create processing threads to use the system resource
      for(int i = 0; i < 5; i++)
      {
        // Call SomeProcess on a new thread
        Thread thread = new Thread(new ThreadStart(SomeProcess));
        thread.Name = String.Format("Thread{0}", i + 1);
        thread.Start();
      }
    }

    /// <summary>
    /// Represents some process to consume a resource.
    /// </summary>
    private static void SomeProcess()
    {
      UseSimulatedResource();
    }

    /// <summary>
    /// Represents a system resource that must be synchronized.
    /// </summary>
    private static void UseSimulatedResource()
    {
      // Wait until it is safe to use the resource
      Console.WriteLine("{0} waiting on resource",
                         Thread.CurrentThread.Name);
      _mutex.WaitOne();
      Console.WriteLine("{0} has resource",
                         Thread.CurrentThread.Name);

      // Put the thread to sleep to pretend we did something
      Thread.Sleep(1000);

      // Release the resource
      Console.WriteLine("{0} done with resource\r\n",
                         Thread.CurrentThread.Name);
      _mutex.ReleaseMutex();
    }
  }
}

Testing the Sample

Run the sample console application provided above. The output will vary according to how quickly each thread starts and the order in which the resource is requested, and is likely to differ slightly from run to run. Each of the five threads reports that it is waiting on the resource, that it has the resource, and that it is done with the resource.

Possible Enhancements

Despite the coverage in Parts 1 and 2 of this multithreading exploration, there are still other topics to cover. Some additional topics you should consider exploring on your own are as follows:

  • C# has a lock statement (SyncLock in VB.NET) that can be used in place of the Monitor.Enter and Monitor.Exit methods. Rather than have the Monitor.Enter at the beginning of the code region and the Monitor.Exit at the end, the lock encapsulates all the code within a block using braces {}.
  • A condition known as a deadlock can occur when two threads are each waiting on the other for a resource. This is a critical condition that you must take care to avoid when synchronizing resources. A common way for a deadlock to occur is for A to be waiting on B to complete while, at the same time, B is waiting on A to complete. To see an example of a deadlock condition, comment out the Monitor.Pulse(this._queue); line in the Dequeue method of the Monitor example, just after the call to Monitor.Enter. This will make the enqueueThread wait for the queue to be released while the dequeueThread waits for the enqueueThread to do something.
  • A common scenario is to create a thread that spends much of its lifetime in a sleeping state or waiting for an event to occur. This can lead to inefficient use of resources. A ThreadPool is an object that allows you to be more efficient by having a thread that monitors wait operations for status changes. When a wait operation completes, a worker thread from the thread pool executes the appropriate callback function to resume execution of the thread. This allows for more efficient use of threads and resources.
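As an example of the first point above, a region guarded with the lock statement is expanded by the compiler into a Monitor.Enter/Monitor.Exit pair wrapped in try/finally. The class and member names in this sketch are illustrative:

```csharp
using System;
using System.Collections;

/// <remarks>
/// Illustrative sketch of the lock statement. The braces mark
/// the protected region; no explicit Enter/Exit calls are needed.
/// </remarks>
class LockSketch
{
  private Queue _queue = new Queue();

  public int Count
  {
    get { return this._queue.Count; }
  }

  public void Add(int item)
  {
    // Equivalent to Monitor.Enter(this._queue) followed by a
    // try/finally that calls Monitor.Exit(this._queue)
    lock(this._queue)
    {
      this._queue.Enqueue(item);
    }
  }
}
```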

Future Columns

Due to the requests and suggestions I received for Part 1 of multithreading, the next column is going to be Part 3 on multithreading. It will cover using the classes in the System.Net and System.Threading namespaces in combination to create a server-based listening application that processes requests using multiple threads. If you have something in particular that you would like to see explained here, you can reach me at mstrawmyer@crowechizek.com.

About the Author

Mark Strawmyer, MCSD, MCSE (NT4/W2K), MCDBA is a Senior Architect of .NET applications for large- and mid-size organizations. Mark is a technology leader with Crowe Chizek in Indianapolis, Indiana. He specializes in architecture, design, and development of Microsoft-based solutions. You can reach Mark at mstrawmyer@crowechizek.com.

# # #


