I have created a Web API in ASP.NET Core 3.x that saves data into an Azure Service Bus queue; that data is later processed for reporting.
The API load is high, so we decided to buffer the data in memory per request. Once the buffer reaches a certain limit (50 items), the next request (the 51st) takes all the data from memory, saves it to the Service Bus in one go, and clears the memory cache.
For sequential requests all of this works fine, but under parallel load some data is lost. I think it is because one batching request takes some time, and while it is running the other requests race on the shared cache.
I did some research, found this article, and used SemaphoreSlim: https://medium.com/swlh/async-lock-mechanism-on-asynchronous-programing-d43f15ad0b3. It works, but is it a good approach? As you can see in the code below, I am serializing every request, but I actually only want to lock while I am processing the batch. I tried to move the lock inside the if condition, but that did not work.
using (await lockThread.LockAsync())
{
    var topVisitedTiles = _service.GetFromCache(CacheKey);
    if (topVisitedTiles?.Count >= 50)
    {
        // Limit reached: flush the whole batch to Service Bus and reset the cache.
        topVisitedTiles?.Add(link);
        await _service.AddNewQuickLinkAsync(topVisitedTiles);
        _service.SetToCache(CacheKey, new List<TopVisitedTilesItem>());
        return Ok(link.Title);
    }
    // Below the limit: just append the item to the in-memory buffer.
    topVisitedTiles?.Add(link);
    _service.SetToCache(CacheKey, topVisitedTiles);
}
return Ok(link.Title);
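For reference, moving the lock inside the if (which still lost data for me) would look roughly like the sketch below. This is a reconstruction of the idea, not my exact code, and the comments show where I believe the race is:

```csharp
// Reconstruction of the "lock inside the if" attempt, not my exact code.
var topVisitedTiles = _service.GetFromCache(CacheKey);
if (topVisitedTiles?.Count >= 50)
{
    using (await lockThread.LockAsync())
    {
        // Problem: several parallel requests can all pass the count check
        // above before the first one clears the cache, so each of them
        // tries to flush the same batch, while the unlocked Add/SetToCache
        // path below races with the reset.
        topVisitedTiles?.Add(link);
        await _service.AddNewQuickLinkAsync(topVisitedTiles);
        _service.SetToCache(CacheKey, new List<TopVisitedTilesItem>());
        return Ok(link.Title);
    }
}
// These writes happen outside the lock, so they can interleave with a flush.
topVisitedTiles?.Add(link);
_service.SetToCache(CacheKey, topVisitedTiles);
return Ok(link.Title);
```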
My research also suggested that ConcurrentBag and BlockingCollection could help, but I am not sure how to apply them in my case. A small pointer in the right direction would help.
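From what I understand so far, one lock-free direction would be a ConcurrentQueue that every request enqueues into, where only the request that pushes the count across the threshold drains the queue and sends a single batch. Below is a minimal self-contained sketch of that pattern, not my actual controller; the names BatchBuffer and FlushAsync, and wiring the flush delegate to something like AddNewQuickLinkAsync, are my assumptions for illustration:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical helper: items go into a ConcurrentQueue; the request whose
// increment lands on a multiple of the limit drains up to one batch and
// sends it in a single call (e.g. to Service Bus).
public class BatchBuffer<T>
{
    private readonly ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
    private int _count;                                   // running total of enqueued items
    private readonly int _limit;
    private readonly Func<IReadOnlyList<T>, Task> _flush; // e.g. batch => AddNewQuickLinkAsync(batch)

    public BatchBuffer(int limit, Func<IReadOnlyList<T>, Task> flush)
    {
        _limit = limit;
        _flush = flush;
    }

    public async Task AddAsync(T item)
    {
        _queue.Enqueue(item);

        // Interlocked keeps the counter consistent under parallel requests;
        // exactly one request observes each multiple of the limit.
        if (Interlocked.Increment(ref _count) % _limit == 0)
        {
            var batch = new List<T>(_limit);
            while (batch.Count < _limit && _queue.TryDequeue(out var dequeued))
                batch.Add(dequeued);

            if (batch.Count > 0)
                await _flush(batch); // one outbound call per full batch
        }
    }
}
```

The appeal of this shape is that only the request that hits the threshold pays for the outbound call; all other requests just enqueue. Items that arrive while a flush is in progress stay in the queue for the next batch, so nothing is dropped.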