h3mm3's blog

Stories from behind the keyboard


A few days ago Raffaele Rialdi spoke at the 18th DotNetMarche Workshop about the .NET Parallel Extensions and other goodies – by the way, if you speak Italian, you cannot miss the recordings of the event: go here and grab Raffaele's sessions and Alkampfer's whole recap about LINQ.

In this post I will use the .NET Parallel Extensions to implement a simple producer/consumer pattern. The scenario is quite simple: one method, a.k.a. the producer, keeps pushing data into a collection (in my case, a queue) while one or more methods, a.k.a. the consumers, access the collection in parallel and grab their own bits of data. In order to keep things simple, I'm implementing these methods in the form of lambdas.

The queue I'm going to use is a ConcurrentQueue<T>: it lives in the new System.Collections.Concurrent namespace and supports concurrent access in a simple, thread-safe way.

var queue = new ConcurrentQueue<int>();
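The thread-safe surface of this class is small; besides the constructor, the members the rest of the post relies on are Enqueue() and TryDequeue(). A quick standalone sanity check (separate from the post's code, with hypothetical values):

```csharp
using System;
using System.Collections.Concurrent;

class QueueDemo
{
    static void Main()
    {
        var q = new ConcurrentQueue<int>();
        q.Enqueue(1);   // enqueue always succeeds
        q.Enqueue(2);

        int n;
        if (q.TryDequeue(out n))      // returns false when the queue is empty
            Console.WriteLine(n);     // prints 1 (FIFO order)

        Console.WriteLine(q.Count);   // prints 1 (one item left)
    }
}
```

Note that TryDequeue() combines the "is there an item?" check and the removal into a single atomic operation, which is exactly what makes it safe to call from several consumers at once.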

The producer will insert 1000 numbers into the queue, at intervals of 10 milliseconds. The ForAll() extension method is available on ParallelQuery objects, which are, in turn, the "parallelizable" version of a query or an IEnumerable<T>. In the following code, I use Enumerable.Range() to get one thousand integers, convert the resulting IEnumerable<int> to a ParallelQuery<int> with AsParallel() and, finally, set up a parallel computation that pushes the integers into the queue.

Action producer = ()=>{
  Enumerable.Range(0, 1000)
    .AsParallel()
    .ForAll(n => {
      Thread.Sleep(10);  // 10 ms between items
      queue.Enqueue(n);
    });
};

The consumer is meant to drain the queue, trying to pull out the next number, if available. When the queue looks empty, the consumer returns, writing the number of items it managed to dequeue. In this simulation, each cycle of the consumer is delayed by a Thread.Sleep() of random duration. The following code declares a Random object and the consumer Action (the integer index the consumer receives is used to label the messages it writes to the Console):

var r = new Random();

Action<int> consumer = idx=>{
  var count = 0;
  do {
    Thread.Sleep(r.Next(100));  // random delay before the next attempt
    int n;
    if (queue.TryDequeue(out n))
      count++;
  } while (queue.Count > 0);

  Console.Write("\n[#{0}: count={1}]", idx, count);
};

The following instruction is the magic that starts the producer and – in parallel – activates ten instances of the consumer Action:

Parallel.Invoke(producer, ()=>Parallel.For(0,10,consumer));

All of this is done thanks to the Parallel static class, which belongs to the brand new System.Threading.Tasks namespace. Parallel.Invoke() launches a parallel computation, running all the actions it receives as input. In this case, it gets only two actions: the producer, and another one (written as a lambda expression) that calls the Parallel.For() method. Parallel.For() is the parallel equivalent of a for (int i = 0; i < max; i++) statement: in the previous code, it cycles from 0 (included) to 10 (excluded) and passes each value to a consumer action, establishing a parallel computation.

You can test the code inside a simple Console application project or – as I did – using LINQPad. The following is a sample of the output I got running the code with LINQPad. As you can see, the various instances of the consumer alternate with each other and grab numbers from the ConcurrentQueue<int>, each consuming data at a different speed thanks to the random-duration Thread.Sleep().

[#0: count=113]
[#9: count=71]
[#1: count=115]
[#8: count=89]
[#2: count=107]
[#5: count=108]
[#4: count=77]
[#6: count=110]
[#3: count=112]
[#7: count=98]

The code described in this post lets you try some of the new classes (and patterns) included in the .NET Parallel Extensions. These classes provide powerful yet simple-to-use methods that let you implement parallel algorithms.
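For convenience, here is the whole example assembled into one console program. It is a sketch, not the post's verbatim listing: the producer and consumer bodies follow the snippets above, the 100 ms upper bound on the random delay is my own choice, and the lock around the shared Random is my addition (Random is not thread-safe when called from several consumers at once):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ProducerConsumerDemo
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();
        var r = new Random();

        // Producer: pushes 1000 integers, pausing 10 ms before each item.
        Action producer = () =>
            Enumerable.Range(0, 1000)
                .AsParallel()
                .ForAll(n =>
                {
                    Thread.Sleep(10);
                    queue.Enqueue(n);
                });

        // Consumer: dequeues until the queue looks empty,
        // sleeping a random interval between attempts.
        Action<int> consumer = idx =>
        {
            var count = 0;
            do
            {
                int delay;
                lock (r) delay = r.Next(100);  // Random is not thread-safe
                Thread.Sleep(delay);

                int n;
                if (queue.TryDequeue(out n))
                    count++;
            } while (queue.Count > 0);

            Console.Write("\n[#{0}: count={1}]", idx, count);
        };

        // Start the producer and ten consumers in parallel.
        Parallel.Invoke(producer, () => Parallel.For(0, 10, consumer));
    }
}
```

One caveat worth noting: a consumer that wakes up before the producer has enqueued anything will see an empty queue and exit immediately, so in a real system you would want an explicit "production finished" signal rather than the Count > 0 test.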

Happy programming!

1 comment:

D_Guidi said...

It could be cool to try the Rx Framework and consume your list with code like:

var obs = list
  .BufferWithCount(TimeSpan.FromSeconds(interval), 1);
using (obs.Subscribe(i => {...}))