
I am having an issue where it looks like my ConcurrentQueue in a singleton is not processing items in the correct order. I know it's FIFO, so I am thinking that maybe the queue in memory is not the same somehow, or something is going wrong with my Dequeue? The way I am testing this is by firing 3 Postman requests at my API endpoint in quick succession. If anybody can help me understand why these are not running one after another, I would greatly appreciate it!

As of now, I am leaning towards Queue.TryPeek not working correctly, as the 2nd and 3rd requests seem to be queuing before the first one is dequeued.

When I run the code below, I see the following output in the console:

Queued message: Test 1
Sending message: Test 1
Queued message: Test 2
Sending message: Test 2
Dequeuing message: Test 2
Returning response: Test 2
Queued message: Test 3
Sending message: Test 3
Dequeuing message: Test 1
Returning response: Test 1
Dequeuing message: Test 3
Returning response: Test 3

This is my API controller method that receives a message and queues it. Once the message is queued, it waits until it sees that request's message at the front of the queue, then sends it and dequeues it.

Controller

[HttpPost]
[Route("message")]
public IActionResult SendMessageUser([FromBody]Message request)
{
    Console.WriteLine($"Queued message: {request.Message}");
    _messageQueue.QueueAndWaitForTurn(request);
    Console.WriteLine($"Sending message: {request.Message}");
    var sendMessageResponse = _messageService.SendMessageToUser(request.Name, request.Message);
    Console.WriteLine($"Dequeuing message: {request.Message}");
    _messageQueue.DequeueMessage(request);
    Console.WriteLine($"Returning response: {request.Message}");
    return Ok(sendMessageResponse);
}

As for the queue, I am registering it with the IoC container like so:

public void ConfigureServices(IServiceCollection services)
{
    services.AddSingleton<IMessageQueue, MessageQueue>();
    services.AddScoped<IMessageService, MessageService>();

    services.AddMvc();
}

And this is my queue singleton class. I am using a singleton here because I would like there to be only one instance of this queue throughout the application's lifetime.

public class MessageQueue : IMessageQueue
{
    private Lazy<ConcurrentQueue<Message>> _queue =
        new Lazy<ConcurrentQueue<Message>>(() => new ConcurrentQueue<Message>());

    public ConcurrentQueue<Message> Queue
    {
        get
        {
            return _queue.Value;
        }
    }

    public void QueueAndWaitForTurn(Message message)
    {
        Queue.Enqueue(message);

        WaitForTurn();
    }

    public bool DequeueMessage(Message message)
    {
        var messageIsDequeued = Queue.TryDequeue(out message);

        return messageIsDequeued;
    }

    public void WaitForTurn()
    {
        Message myMessage = null;
        var myMessageIsNext = Queue.TryPeek(out myMessage);

        while (!Queue.TryPeek(out myMessage))
        {
            Thread.Sleep(1000);
            WaitForTurn();
        }
    }
}
    "The 2nd and 3rd requests seem to be queuing before the first one is dequeued." Why is this unexpected? – ProgrammingLlama Jul 11 '18 at 05:13
  • @john The queuing is fine, I am more concerned of the 2nd dequeing before the first one dequeud. I was trying to make it where it would check if the message for that request was in front so the 2nd one should ideally be waiting until the 1st dequeus and then when the 2nd peeks, it should say hey its my turn and send the message and dequeue itself for the 3rd to go next. – Shawn Jul 11 '18 at 05:20
  • 1
    What processes the queue? That API call? – ProgrammingLlama Jul 11 '18 at 05:22
  • Yes, the call to the api is what is enqueuing the messages then waiting for that message to be in the front then send and dequeue – Shawn Jul 11 '18 at 05:29
  • 1
    So you basically want to accept an API call, have it wait for previous calls to complete, and then complete that API call's action? – ProgrammingLlama Jul 11 '18 at 05:33
  • Right, seems not ideal but I had a weird case where this made the most sense of all the resources I looked at to utilize. There should only be ~50 calls at once max and it takes around 2 seconds to send a message once its ready in the queue, and they need to be sent in the order received. But yes, what you said is essentially what I am looking to accomplish. – Shawn Jul 11 '18 at 05:41

1 Answer


I'd create a kind of FifoSemaphore:

public class FifoSemaphore : IDisposable
{
    private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);
    private readonly Queue<TaskCompletionSource<object>> _taskQueue = new Queue<TaskCompletionSource<object>>(10);
    private readonly object _lockObject = new object();

    public async Task WaitAsync()
    {
        // Enqueue a task
        Task resultTask;
        lock (_lockObject)
        {
            var tcs = new TaskCompletionSource<object>();
            resultTask = tcs.Task;
            _taskQueue.Enqueue(tcs);
        }

        // Wait for the lock
        await _semaphore.WaitAsync();

        // Dequeue the next item and set it to resolved (release back to API call)
        TaskCompletionSource<object> queuedItem;
        lock (_lockObject)
        {
            queuedItem = _taskQueue.Dequeue();
        }
        queuedItem.SetResult(null);

        // Await our own task
        await resultTask;
    }

    public void Release()
    {
        // Release the semaphore so another waiting thread can enter
        _semaphore.Release();
    }

    public void Dispose()
    {
        _semaphore?.Dispose();
    }
}

And then use it like this:

[HttpPost]
[Route("message")]
public async Task<IActionResult> SendMessageUser([FromBody]Message request)
{
    try
    {
        await _fifoSemaphore.WaitAsync();
        // process message code here
    }
    finally // important to have a finally to release the semaphore, so that even in the case of an exception, it can continue to process the next message
    {
        _fifoSemaphore.Release();
    }
}
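For this to work, _fifoSemaphore has to be a single shared instance across requests. The original answer doesn't show the wiring; a minimal registration sketch (the controller name here is hypothetical), mirroring the question's ConfigureServices:

public void ConfigureServices(IServiceCollection services)
{
    // Register the semaphore as a singleton so every request shares one instance
    services.AddSingleton<FifoSemaphore>();
    services.AddScoped<IMessageService, MessageService>();

    services.AddMvc();
}

// In the controller (name hypothetical), the shared instance is injected:
public class MessageController : Controller
{
    private readonly FifoSemaphore _fifoSemaphore;

    public MessageController(FifoSemaphore fifoSemaphore)
    {
        _fifoSemaphore = fifoSemaphore;
    }
}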

The idea is that each waiting item will first be queued.

Next, we wait for the semaphore to let us in (our semaphore allows one item at a time).

Then we dequeue the next waiting item, and release it back to the API method.

Finally, we wait for our own position in the queue to complete and then return to the API method.

In the API method, we asynchronously wait for our turn, do our task, and then return. A try/finally is included to ensure that the semaphore will be released for subsequent messages, even in case of failure.
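To sanity-check the ordering outside of ASP.NET, here is a small console sketch (my addition, not part of the original answer) that pushes three staggered workers through the FifoSemaphore; they should enter in submission order:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class FifoSemaphoreDemo
{
    public static async Task Main()
    {
        var fifo = new FifoSemaphore();
        var tasks = new List<Task>();

        for (int i = 1; i <= 3; i++)
        {
            int id = i; // capture the loop variable for the closure
            tasks.Add(Task.Run(async () =>
            {
                await fifo.WaitAsync();
                try
                {
                    Console.WriteLine($"Request {id} entered");
                    await Task.Delay(500); // stand-in for the ~2 second SendMessageToUser call
                }
                finally
                {
                    fifo.Release();
                }
            }));

            await Task.Delay(50); // stagger the starts so the enqueue order is deterministic
        }

        await Task.WhenAll(tasks);
        // Expected output: Request 1 entered, Request 2 entered, Request 3 entered
    }
}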

– ProgrammingLlama
  • love the idea of using a semaphore, I had to change the initialization from new SemaphoreSlim(0, 1) to new SemaphoreSlim(1, 1), as it seemed not to let me past the semaphore when the initial count was 0. Thank you for taking the time to answer this problem for me! – Shawn Jul 11 '18 at 17:44