
I created a web API in ASP.NET Core 3.x that is responsible for saving data into an Azure Service Bus queue; that data is then processed for reporting.

The API load is very high, so we decided to buffer data in memory for each request. Once the buffered data reaches a certain limit (> 50 items), the next request (the 51st) takes all the data from memory, saves it to the Service Bus queue in one go, and clears the memory cache.

For sequential requests all of the logic works fine, but when the load arrives in parallel some data is lost. I think this is because the request that sends the batch takes some time, and everything that arrives during that window hits the problem.

I did some research, found the article below, and used SemaphoreSlim. It works, but is it a good approach? As you can see in the code below, I am locking around every request, but I really only want to lock while I am processing the batch. I tried to move the lock inside the if condition, but that did not work. https://medium.com/swlh/async-lock-mechanism-on-asynchronous-programing-d43f15ad0b3

    // lockThread is the SemaphoreSlim-based AsyncLock from the linked article.
    using (await lockThread.LockAsync())
    {
        var topVisitedTiles = _service.GetFromCache(CacheKey);
        if (topVisitedTiles?.Count >= 50)
        {
            // Limit reached: add the current item, flush the whole batch
            // to Service Bus, then reset the cache.
            topVisitedTiles?.Add(link);
            await _service.AddNewQuickLinkAsync(topVisitedTiles);
            _service.SetToCache(CacheKey, new List<TopVisitedTilesItem>());
            return Ok(link.Title);
        }

        // Below the limit: just buffer the item back into the cache.
        topVisitedTiles?.Add(link);
        _service.SetToCache(CacheKey, topVisitedTiles);
    }
    return Ok(link.Title);

From my research it seems that ConcurrentBag and BlockingCollection could help, but I am not sure how to use them in my case. A small pointer in the right direction would help me.

V_B
  • You could remove it, exit the lock, and *then* call `AddNewQuickLinkAsync`. – Stephen Cleary Mar 11 '21 at 18:28
  • @StephenCleary Thanks for your reply. Do you mean I should take AddNewQuickLinkAsync out of the lock and put it outside? But then I still have to lock on every request, and that is what I want to avoid; I want to lock only when the count goes over the limit. – V_B Mar 12 '21 at 08:59
  • `I want to lock only when the count goes over the limit.` - I don't think this is possible. You have to lock just to safely `Add` and test whether you're over the limit. So lock the part that needs locking (get+update the cache item), and don't lock the part that doesn't need locking (`AddNewQuickLinkAsync`). – Stephen Cleary Mar 12 '21 at 13:31
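For reference, a minimal sketch of what this comment suggests, reusing the `lockThread`, `_service`, `CacheKey`, and `link` names from the question (everything else is an assumption): hold the lock only for the cache read/update, and make the slow Service Bus call after the lock is released.

    // Sketch only: lock protects the shared cache, not the Service Bus call.
    List<TopVisitedTilesItem> batchToSend = null;

    using (await lockThread.LockAsync())
    {
        // Only the cache read/update happens under the lock.
        var topVisitedTiles = _service.GetFromCache(CacheKey) ?? new List<TopVisitedTilesItem>();
        topVisitedTiles.Add(link);

        if (topVisitedTiles.Count >= 50)
        {
            // Take the full batch out of the cache and release the lock before sending.
            batchToSend = topVisitedTiles;
            _service.SetToCache(CacheKey, new List<TopVisitedTilesItem>());
        }
        else
        {
            _service.SetToCache(CacheKey, topVisitedTiles);
        }
    }

    // The slow Service Bus call runs outside the lock, so it no longer blocks other requests.
    if (batchToSend != null)
    {
        await _service.AddNewQuickLinkAsync(batchToSend);
    }

    return Ok(link.Title);

This way every request still takes the lock briefly (which is unavoidable for a safe add-and-count), but no request ever waits for a Service Bus send.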

1 Answer


You can use the TPL Dataflow library if you don't want to dive deep into concurrent implementations of bags or queues.

In your case, something like this can be used:

        // Requires the System.Threading.Tasks.Dataflow package:
        // using System.Threading.Tasks.Dataflow;

        // Define a BatchBlock that groups incoming messages into batches of 10.
        var batchBlock = new BatchBlock<string>(10);

        // Define an ActionBlock that processes each batch received from the BatchBlock.
        var processingBlock = new ActionBlock<string[]>((messages) =>
        {
            Console.WriteLine("-------------");
            Console.WriteLine($"Number of messages: {messages.Length}");
            Console.WriteLine($"Messages: {string.Join(", ", messages)}");
        });

        // Link the processing block to the batch block and propagate completion.
        batchBlock.LinkTo(processingBlock);
        batchBlock.Completion.ContinueWith((t) =>
        {
            processingBlock.Complete();
        });

        // Two producers posting to the same BatchBlock in parallel.
        var task1 = Task.Run(async () =>
        {
            for (int i = 0; i < 50; i++)
            {
                await batchBlock.SendAsync($"Message {i}");
            }
        });

        var task2 = Task.Run(async () =>
        {
            for (int i = 50; i < 100; i++)
            {
                await batchBlock.SendAsync($"Message {i}");
            }
        });

        await Task.WhenAll(task1, task2);

        // Complete the pipeline. You can leave it active if you want.
        batchBlock.Complete();
        await processingBlock.Completion;
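If you want to adapt this to the scenario in the question, a rough sketch could look like the following. The batch size of 50 and the `AddNewQuickLinkAsync` call come from the question; the `QuickLinkBatcher` class name, the `IQuickLinkService` interface, and the singleton wiring are assumptions.

    using System.Linq;
    using System.Threading.Tasks;
    using System.Threading.Tasks.Dataflow;

    public class QuickLinkBatcher
    {
        private readonly BatchBlock<TopVisitedTilesItem> _batchBlock;
        private readonly ActionBlock<TopVisitedTilesItem[]> _sendBlock;

        public QuickLinkBatcher(IQuickLinkService service)
        {
            // Group incoming items into batches of 50 (the limit from the question).
            _batchBlock = new BatchBlock<TopVisitedTilesItem>(50);

            // Send each completed batch to Service Bus. MaxDegreeOfParallelism = 1
            // keeps the sends sequential, so no manual locking is needed.
            _sendBlock = new ActionBlock<TopVisitedTilesItem[]>(
                batch => service.AddNewQuickLinkAsync(batch.ToList()),
                new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 });

            _batchBlock.LinkTo(_sendBlock, new DataflowLinkOptions { PropagateCompletion = true });
        }

        // Called from the controller action instead of reading/writing the cache.
        public Task<bool> AddAsync(TopVisitedTilesItem link) => _batchBlock.SendAsync(link);
    }

Register this as a singleton so all requests share the same `BatchBlock`; the controller then just calls `await _batcher.AddAsync(link)` and returns. One caveat: a plain `BatchBlock` only emits when a batch is full, so if you also need time-based flushing you would have to call `TriggerBatch()` yourself (for example from a timer).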
ivan_k