
I have a queue of jobs that can be populated by multiple threads (ConcurrentQueue<MyJob>). I need these jobs to be executed continuously and asynchronously (not by the main thread), but only by one thread at a time. I've tried something like this:

using System.Collections.Concurrent;
using System.Threading.Tasks;

public class ConcurrentLoop {
    private static ConcurrentQueue<MyJob> _concurrentQueue = new ConcurrentQueue<MyJob>();

    private static Task _currentTask;
    private static object _lock = new object();

    public static void QueueJob(MyJob job)
    {
        _concurrentQueue.Enqueue(job);
        checkLoop();
    }

    private static void checkLoop()
    {
        if ( _currentTask == null || _currentTask.IsCompleted )
        {
            lock (_lock)
            {
                if ( _currentTask == null || _currentTask.IsCompleted )
                {
                    _currentTask = Task.Run(() =>
                    {
                        MyJob current;
                        while ( _concurrentQueue.TryDequeue( out current ) )
                        {
                            //Do something
                        }
                    });
                }
            }
        }
    }
}

This code, in my opinion, has a problem: if the task is finishing (TryDequeue has returned false, but the task has not yet been marked as completed) and at that exact moment a new job is queued, that job will not be executed. Am I right? If so, how do I fix this?

madmanul
  • Is there a real reason you're trying to limit the number of threads? It's better to let the framework handle optimization for you most of the time – konkked Sep 19 '16 at 22:24
  • Use lock on the block of code that you want executed by one thread at a time. This should resolve the issue. – BetterLateThanNever Sep 19 '16 at 22:29
  • lock with `ConcurrentQueue`? I believe there is a better way – madmanul Sep 19 '16 at 22:30
  • @konkked, yes there is. – madmanul Sep 19 '16 at 22:31
  • @xalz ...okay, what is it? – konkked Sep 19 '16 at 22:32
  • _"only by one thread at the same time"_ -- the obvious answer is, have only a single thread consume the queue. Did you try anything? Please provide a good [mcve] that shows clearly what you've tried, along with a detailed and specific explanation of what that code does and what you want it to do instead. – Peter Duniho Sep 19 '16 at 22:33

2 Answers

3

Your problem statement looks like a producer-consumer problem, with a caveat that you only want a single consumer.

There is no need to reimplement such functionality manually. Instead, I suggest using BlockingCollection -- by default it wraps a ConcurrentQueue, and you consume it from a single dedicated thread. Note that this may or may not be suitable for your use case.

Something like:

// You may want to create a bounded or an unbounded collection, as needed.
_blockingCollection = new BlockingCollection<MyJob>();

_consumingThread = new Thread(() =>
{
    // Blocks when there is no more work to do, continues whenever a new item is added.
    foreach (var workItem in _blockingCollection.GetConsumingEnumerable())
    {
        // do work with workItem
    }
});
_consumingThread.Start();

Multiple producers (tasks or threads) can add work items to the _blockingCollection without any problem, and there is no need to worry about synchronizing the producers and the consumer.

When you are done producing, call _blockingCollection.CompleteAdding() (this method is not thread safe, so it is advised to stop all producers beforehand). You should probably also call _consumingThread.Join() somewhere to wait for the consuming thread to finish.
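To show how the pieces fit together, here is a minimal end-to-end sketch; the JobLoop class name, the MyJob payload, and the Console.WriteLine body are placeholders for whatever your jobs actually do:

using System;
using System.Collections.Concurrent;
using System.Threading;

public class MyJob
{
    public int Id { get; set; }
}

public static class JobLoop
{
    private static readonly BlockingCollection<MyJob> _jobs = new BlockingCollection<MyJob>();

    private static readonly Thread _consumer = new Thread(() =>
    {
        // Blocks while the collection is empty; the loop ends once
        // CompleteAdding() has been called and the queue has been drained.
        foreach (var job in _jobs.GetConsumingEnumerable())
        {
            Console.WriteLine($"Processing job {job.Id} on thread {Thread.CurrentThread.ManagedThreadId}");
        }
    });

    public static void Start() => _consumer.Start();

    // Safe to call from any number of producer threads.
    public static void QueueJob(MyJob job) => _jobs.Add(job);

    public static void Shutdown()
    {
        _jobs.CompleteAdding(); // no further items will be accepted
        _consumer.Join();       // wait until the remaining items are processed
    }
}

Producers simply call JobLoop.QueueJob(...) and never have to know about the consumer thread at all.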

0

I would use Microsoft's Reactive Framework (Reactive Extensions, NuGet "System.Reactive") for this. It's a lovely abstraction.

using System;
using System.Reactive.Concurrency;
using System.Reactive.Linq;
using System.Reactive.Subjects;

public class ConcurrentLoop
{
    private static readonly Subject<MyJob> _jobs = new Subject<MyJob>();

    private static readonly IDisposable _subscription =
        _jobs
            .Synchronize()                 // serialize concurrent OnNext calls
            .ObserveOn(Scheduler.Default)  // execute handlers on the thread pool
            .Subscribe(job =>
            {
                //Do something
            });

    public static void QueueJob(MyJob job)
    {
        _jobs.OnNext(job);
    }
}

This nicely synchronizes all incoming jobs into a single stream and pushes their execution onto Scheduler.Default (which is basically the thread pool), but because all input has been serialized, only one job can run at a time. The nice thing about this is that it releases the thread if there is a significant gap between values. It's a very lean solution.

To clean up, just call either _jobs.OnCompleted() or _subscription.Dispose().
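As a rough usage sketch (the Parallel.For producers, the Program class, and the idea of exposing a Shutdown() helper are assumptions, not part of the class above):

using System;
using System.Threading.Tasks;

public static class Program
{
    public static void Main()
    {
        // Any number of threads can call QueueJob concurrently; Rx serializes
        // the stream, so the Subscribe handler runs only one job at a time.
        Parallel.For(0, 20, _ => ConcurrentLoop.QueueJob(new MyJob()));

        // Keep the process alive long enough for the queued jobs to finish
        // on the thread pool before exiting.
        Console.ReadLine();

        // On shutdown you would complete the stream or dispose the subscription,
        // e.g. by exposing a Shutdown() method on ConcurrentLoop that calls
        // _jobs.OnCompleted() or _subscription.Dispose().
    }
}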

Enigmativity