Working with Thread Synchronization and Semaphores in C#

A process is an instance of a program that is currently in execution, and it can have one or more threads of execution. A thread is the fundamental unit of CPU activity. It comprises a program counter that specifies the next instruction to execute, a stack area, and a set of registers. In other words, a process can have several paths of execution, i.e., several threads.

Thread synchronization and semaphores are two of the most important concepts related to threads. This article discusses threads, thread synchronization, and semaphores in detail, and shows how you can work with threads in C#.

What is Thread Synchronization?

Data inconsistency occurs when many threads access a shared resource at a given point in time. Let us understand this with an example. Imagine that there are two threads, T1 and T2, accessing a shared resource R simultaneously. If T1 is trying to read data from the shared resource R while the other thread T2 is attempting to write data to it, there would be data inconsistency. Here’s precisely where thread synchronization comes to the rescue.

Thread Synchronization is a mechanism used to disallow multiple threads from accessing a shared resource at the same point in time. In other words, thread synchronization can help prevent multiple threads from gaining access to a shared resource simultaneously. As a result, one and only one thread can enter a critical section at any given point in time.

Read: Working with Files in C#

Why do we need Thread Synchronization?

Thread synchronization helps you achieve the following:

  • Atomicity – Thread synchronization supports atomicity, which ensures that multiple threads in the application are not allowed to access a shared resource concurrently to prevent data inconsistency. The code segment that contains the necessary instructions for this is known as the critical section. These critical sections are executed atomically – by one and only one thread.
  • Ordering – You might often want two or more threads to perform a task in a predictable order or restrict access to shared resources to a particular number of threads only. Usually, you don’t have much control over all this, which is one reason for race conditions. Thread synchronization provides support for ordering so that you can orchestrate your threads to perform the tasks as per your plan.
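To illustrate atomicity, here is a minimal sketch (the class and variable names are arbitrary) in which two threads increment a shared counter inside a critical section guarded by the lock keyword. Without the lock, the two threads would interleave their read-modify-write steps and lose updates:

```csharp
using System;
using System.Threading;

class Counter
{
    private static readonly object lockObj = new object();
    private static int count = 0;

    static void Increment()
    {
        for (int i = 0; i < 100000; i++)
        {
            lock (lockObj) // only one thread executes this critical section at a time
            {
                count++;
            }
        }
    }

    static void Main()
    {
        Thread t1 = new Thread(Increment);
        Thread t2 = new Thread(Increment);
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();
        Console.WriteLine(count); // always 200000 with the lock in place
    }
}
```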

Exclusive and Non-Exclusive Thread Locks in C#

When a process or a thread wants to access an object, it requests a lock on that object. There are two types of locks that determine access to shared resources – exclusive lock and non-exclusive lock.

Exclusive lock

An exclusive lock makes sure that only one thread can gain access or enter a critical section at any given point in time. You can implement exclusive locks using the lock keyword (syntactic sugar for the static methods of the Monitor class) and the Mutex and SpinLock classes in C#. 
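Because the lock keyword is syntactic sugar for the Monitor class, a lock statement compiles down to roughly the following Monitor-based sketch (the lockTaken pattern shown is how the compiler guarantees the lock is released even if the critical section throws):

```csharp
using System;
using System.Threading;

class Program
{
    private static readonly object lockObj = new object();

    static void Main()
    {
        // lock (lockObj) { ... } expands to roughly this:
        bool lockTaken = false;
        try
        {
            Monitor.Enter(lockObj, ref lockTaken);
            // This is the critical section
            Console.WriteLine("Inside the critical section");
        }
        finally
        {
            if (lockTaken)
            {
                Monitor.Exit(lockObj); // release only if the lock was actually acquired
            }
        }
    }
}
```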

Non-exclusive lock

Non-exclusive locks provide read-only access to a shared resource and limit concurrency, i.e., limit the number of concurrent accesses to a shared resource. You can take advantage of the Semaphore, SemaphoreSlim, and ReaderWriterLockSlim classes to implement non-exclusive locks in C#.
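As a sketch of non-exclusive locking, the following example (the Cache class name and value are arbitrary) uses ReaderWriterLockSlim so that many readers can hold the lock concurrently while writers get exclusive access:

```csharp
using System;
using System.Threading;

class Cache
{
    private static readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private static int sharedValue = 0;

    static void Write(int value)
    {
        rwLock.EnterWriteLock(); // exclusive: blocks all readers and writers
        try { sharedValue = value; }
        finally { rwLock.ExitWriteLock(); }
    }

    static int Read()
    {
        rwLock.EnterReadLock(); // non-exclusive: many readers may hold this concurrently
        try { return sharedValue; }
        finally { rwLock.ExitReadLock(); }
    }

    static void Main()
    {
        Write(42);
        Console.WriteLine(Read());
    }
}
```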

What is a Thread Deadlock in C#? Why Does it Occur?

Deadlock is a blocking situation in which a set of concurrent processes sharing resources stall one another because each is attempting to access a resource already locked by another process. In other words, deadlock is a scenario in which a group of concurrent processes is stalled because one process holds a resource while waiting for another resource that has already been acquired by a different process.

Deadlock can occur only if all of the following conditions hold true:

  • Mutual Exclusion: This implies that only one thread can have access to a resource at a particular time.
  • Hold and Wait: This is a condition in which a thread is holding at least one resource and waiting for a minimum of one resource already acquired by another thread.
  • No Pre-emption: If a thread has acquired a resource, it cannot be taken away from the thread until it relinquishes control of the resource voluntarily.
  • Circular Wait: This is a condition in which two or more threads are waiting for a resource acquired by the next member in the chain.

Let us understand deadlock with an example. Consider two processes, P1 and P2, each waiting for the other to relinquish control of the resources R1 and R2 held by the other. Now imagine the following sequence of events:

  1. P1 acquires a lock on R1.
  2. P2 acquires a lock on R2.
  3. P1 attempts to acquire a lock on R2 but since it is already acquired by P2, it blocks until the lock on R2 is released.
  4. P2 attempts to acquire a lock on R1, but since it has already been acquired by P1, it blocks until the lock on R1 is released.

Hence, both processes are blocked and can’t move ahead until the locks on the shared resources are relinquished. We can now say that the two processes P1 and P2 are deadlocked!
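One standard way to break the circular wait described above is to make every thread acquire the locks in the same fixed order. The following sketch (resource and thread names are illustrative) applies that lock-ordering discipline, so the four-step deadlock sequence cannot occur:

```csharp
using System;
using System.Threading;

class Program
{
    private static readonly object r1 = new object();
    private static readonly object r2 = new object();

    // Both threads acquire r1 before r2, so a circular wait cannot form.
    static void Work(string name)
    {
        lock (r1)
        {
            lock (r2)
            {
                Console.WriteLine(name + " holds both resources");
            }
        }
    }

    static void Main()
    {
        Thread t1 = new Thread(() => Work("T1"));
        Thread t2 = new Thread(() => Work("T2"));
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();
    }
}
```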

Note that deadlocks can occur in much the same way when concurrent threads access a shared resource.

Read: Thread Synchronization Using Reset Events in .NET

How Can You Prevent Deadlocks?

The simplest way to avoid deadlock in C# is by using the Monitor class with a timeout value. The following code snippet illustrates how this can be achieved.

class Program {
  private static readonly object lockObj = new object();
  static void Main(string[] args) {
    if (Monitor.TryEnter(lockObj, 100)) {
      try {
        //This is the critical section
      } finally {
        //Release the lock only if it was actually acquired
        Monitor.Exit(lockObj);
      }
    } else {
      //Could not acquire the lock within 100 ms;
      //handle the timeout here instead of blocking forever
    }
  }
}

Implementing Thread Synchronization in C#

You can implement Thread Synchronization using special objects known as synchronization primitives, provided by the operating system as well as your programming language. You can take advantage of such synchronization primitives in your code to restrict multiple threads from entering the critical section and accessing a shared resource simultaneously.

What is a Mutex?

Mutex is a synchronization primitive used in thread synchronization to grant exclusive access to one and only one thread at any given point in time. Once a thread has acquired a mutex, another thread that attempts to obtain the mutex will have to wait till the previous thread relinquishes control of the mutex. It should be noted that after a thread has finished leveraging a mutex object, it must call the ReleaseMutex method to relinquish control.

The following code snippet illustrates how you can create a mutex and release it when you’re done using it.

    public class Program
    {
        private static Mutex mutex = null;
        public static void Main(string[] args)
        {
            const string applicationName = "DeadlockDemo";
            bool isCreatedNew;
            mutex = new Mutex(true, applicationName, out isCreatedNew);
            if (!isCreatedNew)
            {
                Console.WriteLine("Another instance is already running. Press any key to exit...");
                Console.ReadKey();
                return;
            }
            else
            {
                Console.WriteLine("This is the only instance running. Press any key to exit...");
                Console.ReadKey();
            }
            mutex.ReleaseMutex();
        }
    }

Waiting on a mutex implies waiting until you can get it. The WaitOne method returns true if the mutex can be obtained within the specified duration, false otherwise.
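The following sketch shows the timed form of WaitOne (the 500 ms timeout is an arbitrary choice); because the mutex here is local and uncontended, the wait succeeds immediately:

```csharp
using System;
using System.Threading;

class Program
{
    static void Main()
    {
        Mutex mutex = new Mutex();
        // Wait up to 500 ms for the mutex; returns true if it was acquired.
        if (mutex.WaitOne(500))
        {
            try
            {
                Console.WriteLine("Mutex acquired");
            }
            finally
            {
                mutex.ReleaseMutex(); // the owning thread must release it
            }
        }
        else
        {
            Console.WriteLine("Timed out waiting for the mutex");
        }
    }
}
```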

Read: Application Pools and Application Domains in C#

What is a Semaphore?

A Semaphore can restrict the number of concurrent threads that may access a shared resource and implement non-exclusive locking. You can take advantage of a semaphore to control how many consumers may access a specific shared resource simultaneously.

The following code listing illustrates how you can work with semaphores in C#.

    class Program
    {
        //Initial count 3, maximum count 5: at most three of the
        //ten threads can be inside the critical section at once
        public static Semaphore semaphore = new Semaphore(3, 5);
        public static void Main(string[] args)
        {
            for (int i = 0; i < 10; i++)
            {
                Thread threadObject = new Thread(Process);
                threadObject.Name = "Thread: " + i;
                threadObject.Start();
            }
            Console.ReadLine();
        }
        private static void Process()
        {
            Console.WriteLine("{0} is waiting to enter the critical section.", Thread.CurrentThread.Name);
            semaphore.WaitOne();
            Console.WriteLine("{0} is inside the critical section now...", Thread.CurrentThread.Name);
            Thread.Sleep(1000);
            Console.WriteLine("{0} is releasing the critical section...", Thread.CurrentThread.Name);
            semaphore.Release();
        }
    }

When you run the above program, the threads print messages as they wait for, enter, and release the critical section. Because the semaphore’s initial count is 3, at most three threads are inside the critical section at any given time, and the exact interleaving of the output varies between runs.

Summary of Thread Synchronization in C#

When working with multi-threading and thread synchronization, you must be aware of the performance drawbacks. In multi-threading, context switching is a costly operation. Likewise, thread synchronization constructs like locks also have performance drawbacks. So, you should use them only if your application needs them.

Joydip Kanjilal
A Microsoft Most Valuable Professional in ASP.NET, Speaker, and Author of several books and articles. More than 25 years of experience in IT with more than 18 years in Microsoft .NET and its related technologies. He was selected as a Community Credit Winner at http://www.community-credit.com several times. He has authored 8 books and more than 500 articles in some of the most reputed sites worldwide including MSDN, Info World, CodeMag, Tech Beacon, Tech Target, Developer, CodeGuru, and more.
