
Monday, June 15, 2009

Threading || Thread Pooling

Thread Pooling

 

Why Thread Pooling beats one thread per Wait Handle –

 

If your application has lots of threads that spend most of their time blocked on a Wait Handle, you can reduce the resource burden via Thread Pooling.

A thread pool economizes by coalescing many Wait Handles onto a few threads.

 

Approaches

 

Three ways to use the thread pool –

 

  • ThreadPool.RegisterWaitForSingleObject
  • ThreadPool.QueueUserWorkItem
  • Asynchronous Delegates

 

 

ThreadPool.RegisterWaitForSingleObject

 

To use the thread pool, you register a Wait Handle along with a delegate to be executed when the Wait Handle is signaled.

This is done by calling ThreadPool.RegisterWaitForSingleObject.

 

 

 

    // Requires: using System; using System.Threading;
    class Test
    {
        static ManualResetEvent starter = new ManualResetEvent(false);

        public static void Main()
        {
            // Run Go on a pooled thread when 'starter' is signaled.
            // -1 means no timeout; true means execute the callback only once.
            ThreadPool.RegisterWaitForSingleObject(starter, Go, "hello", -1, true);
            Thread.Sleep(5000);
            Console.WriteLine("Signaling worker...");
            starter.Set();
            Console.ReadLine();
        }

        public static void Go(object data, bool timedOut)
        {
            Console.WriteLine("Started " + data);
            // Perform task...
        }
    }

 

Output –

 

(5 second delay)

Signaling worker...

Started hello


All pooled threads are background threads, meaning they terminate automatically when the application's foreground thread(s) end.

However, if you want to wait for important jobs running on pooled threads to complete before exiting an application, calling Join on the threads is not an option – pooled threads never finish! They are instead recycled, and end only when the parent process terminates. So to know when a job running on a pooled thread has finished, you must signal – for instance, with another Wait Handle.
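As a minimal sketch of that signaling pattern (the class and method names here are illustrative, not from the examples above), the queued job sets a second Wait Handle when it finishes, and the main thread blocks on it:

```csharp
using System;
using System.Threading;

// Hypothetical demo: the pooled job signals a second Wait Handle
// (a ManualResetEvent) so the main thread knows when it has finished.
class CompletionDemo
{
    static ManualResetEvent done = new ManualResetEvent(false);
    public static bool Finished;

    public static void Main()
    {
        ThreadPool.QueueUserWorkItem(Work);
        done.WaitOne();                    // block until the pooled job signals
        Console.WriteLine("Job finished");
    }

    static void Work(object state)
    {
        // Perform task...
        Finished = true;
        done.Set();                        // signal completion
    }
}
```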

 

 

ThreadPool.QueueUserWorkItem

 

You can also use the thread pool without a Wait Handle by calling the QueueUserWorkItem method – specifying a delegate for immediate execution.

Here you lose the economy of sharing threads amongst multiple waiting jobs, but gain another benefit: the thread pool keeps a cap on the total number of worker threads (25, by default), automatically enqueuing tasks when the job count exceeds this. It's rather like an application-wide producer-consumer queue with 25 consumers!

 

In the following example, 100 jobs are enqueued to the thread pool, of which 25 execute at a time.

 

 

    // Requires: using System; using System.Threading;
    class Test
    {
        static int runningWorkers = 100;

        public static void Main()
        {
            for (int i = 0; i < runningWorkers; i++)
            {
                ThreadPool.QueueUserWorkItem(Go, i);
            }
            // The main thread doesn't wait, so "Complete!" may print
            // before the queued jobs have even started:
            Console.WriteLine("Waiting for threads to complete...");
            Console.WriteLine("Complete!");
            Console.ReadLine();
        }

        public static void Go(object instance)
        {
            Console.WriteLine("Started: " + instance);
            Thread.Sleep(1000);
        }
    }

 

Output –

 

Waiting for threads to complete...

Complete!

Started: 0

Started: 2

……

Started: 99


Notice that the main thread finishes before the queued jobs complete. If you need to hold the main thread until all queued jobs are done, use Wait and Pulse: the main thread waits until the worker count drops to zero, and each job pulses the lock as it finishes.

 

 

 

    // Requires: using System; using System.Threading;
    class Test
    {
        static object workerLocker = new object();
        static int runningWorkers = 100;

        public static void Main()
        {
            for (int i = 0; i < runningWorkers; i++)
            {
                ThreadPool.QueueUserWorkItem(Go, i);
            }
            Console.WriteLine("Waiting for threads to complete...");
            lock (workerLocker)
            {
                // Block until every worker has decremented the count:
                while (runningWorkers > 0)
                {
                    Monitor.Wait(workerLocker);
                }
            }
            Console.WriteLine("Complete!");
            Console.ReadLine();
        }

        public static void Go(object instance)
        {
            Console.WriteLine("Started: " + instance);
            Thread.Sleep(1000);
            lock (workerLocker)
            {
                runningWorkers--;
                Monitor.Pulse(workerLocker);   // wake the waiting main thread
            }
        }
    }

 

Output –

 

Waiting for threads to complete...

Started: 0

Started: 2

……

Started: 99

Complete!


Asynchronous Delegates

 

Asynchronous delegates also provide another way into the thread pool.

 

  • They enable two-way communication – you can get a return value back from the thread when it finishes executing, and pass any number of typed arguments in both directions.
  • Furthermore, an unhandled exception on an asynchronous delegate is conveniently re-thrown on the original thread (when EndInvoke is called), and so doesn't need explicit handling on the worker.
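A minimal sketch of both points (the delegate and method names here are illustrative; delegate BeginInvoke/EndInvoke requires the .NET Framework CLR):

```csharp
using System;

class ReturnValueDemo
{
    delegate int Square(int x);

    // Computes x * x on a pooled thread and collects the result.
    public static int SquareOf(int x)
    {
        Square square = delegate(int n) { return n * n; };

        // BeginInvoke runs the delegate on a pooled thread:
        IAsyncResult cookie = square.BeginInvoke(x, null, null);

        // ... other work could happen here, in parallel ...

        // EndInvoke blocks until done and returns the result.
        // Had the delegate thrown, the exception would be re-thrown here:
        return square.EndInvoke(cookie);
    }

    public static void Main()
    {
        Console.WriteLine(SquareOf(7));   // 49
    }
}
```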

 

 

 

    // Requires: using System; using System.Net;
    class Test
    {
        delegate string DownloadString(string uri);

        static void ComparePages(string uri1, string uri2)
        {
            DownloadString download1 = new WebClient().DownloadString;
            DownloadString download2 = new WebClient().DownloadString;

            // Start both downloads on pooled threads:
            IAsyncResult cookie1 = download1.BeginInvoke(uri1, null, null);
            IAsyncResult cookie2 = download2.BeginInvoke(uri2, null, null);

            // Perform some random calculation:
            double seed = 1.23;
            for (int i = 0; i < 1000000; i++) seed = Math.Sqrt(seed + 1000);

            // Collect the results, blocking until each download completes.
            // Any download exceptions are re-thrown here:
            string s1 = download1.EndInvoke(cookie1);
            string s2 = download2.EndInvoke(cookie2);
            Console.WriteLine(s1 == s2 ? "Same" : "Different");
        }
    }


Asynchronous Methods

 

Some types in the .NET Framework offer asynchronous versions of their methods, with names starting with "Begin" and "End". These are called asynchronous methods and have signatures similar to those of asynchronous delegates, but exist to solve a much harder problem: to allow more concurrent activities than you have threads. A web or TCP sockets server, for instance, can process several hundred concurrent requests on just a handful of pooled threads if written using NetworkStream.BeginRead and NetworkStream.BeginWrite.

Unless you're writing a high-concurrency application, however, you should avoid asynchronous methods for a number of reasons:

  • Unlike asynchronous delegates, asynchronous methods may not actually execute in parallel with the caller.
  • The benefits of asynchronous methods erode or disappear if you fail to follow the pattern meticulously.
  • Things can get complex pretty quickly even when you do follow the pattern correctly.

If you're simply after parallel execution, you're better off calling the synchronous version of the method (e.g. NetworkStream.Read) via an asynchronous delegate. Another option is to use ThreadPool.QueueUserWorkItem or BackgroundWorker – or simply create a new thread.

Chapter 20 of C# 3.0 in a Nutshell explains asynchronous methods in detail.
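A sketch of that recommendation – pushing a synchronous Read onto a pooled thread via an asynchronous delegate. The names are illustrative, and a MemoryStream stands in here for the NetworkStream you'd use in a real server:

```csharp
using System;
using System.IO;

class ReadDemo
{
    delegate int Read(byte[] buffer, int offset, int count);

    // Runs the blocking Read on a pooled thread; returns the byte count.
    public static int ReadInBackground(Stream stream)
    {
        byte[] buffer = new byte[1024];
        Read read = stream.Read;           // synchronous method as a delegate

        // The potentially blocking Read now runs on a pooled thread:
        IAsyncResult cookie = read.BeginInvoke(buffer, 0, buffer.Length, null, null);

        // ... do other work here ...

        return read.EndInvoke(cookie);     // wait and collect the result
    }

    public static void Main()
    {
        Stream stream = new MemoryStream(new byte[] { 1, 2, 3 });
        Console.WriteLine(ReadInBackground(stream) + " bytes read");   // 3 bytes read
    }
}
```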

 

 

Hope this helps.

 

Thanks & Regards,

Arun Manglick || Senior Tech Lead

 

 
