
I have an Azure Cloud Service based HTTP API which currently serves its data out of an Azure SQL database. We also have an in-role cache on the WebRole side.

Generally this model works fine for us, but sometimes we get a large number of requests for the same resource within a short span of time. If that resource is not in the cache, all of those requests go directly to our DB, which is a problem because the DB often cannot handle that much load.

Looking at the nature of the problem, it seems like a fairly common one that most people building APIs would face. I was thinking that I could somehow send only the first request to the DB and hold all the remaining ones until the first completes, to control the load going to the DB, but I didn't find any good way of doing it. Is there a standard/recommended way of doing this in Azure/IIS?

Ravi Gupta
  • How long is this short period of time, and how long does it take to retrieve the resource from the SQL db? – Igorek Aug 25 '14 at 05:04
  • In a period of 5 min we got 800 requests for the same resource. Also, the query in this case pulls a lot of data from SQL, hence taking around 6-8 sec. – Ravi Gupta Aug 25 '14 at 09:15

1 Answer


The way we're handling this kind of scenario is by putting calls to the DB inside a lock statement. That way only one caller at a time will hit the DB. Here's pseudo code that you can try:

        private static readonly object CacheLock = new object();

        var cachedItem = ReadFromCache();
        if (cachedItem != null)
        {
            return cachedItem;
        }
        lock (CacheLock)
        {
            // Re-check inside the lock: another thread may have
            // populated the cache while we were waiting.
            cachedItem = ReadFromCache();
            if (cachedItem != null)
            {
                return cachedItem;
            }
            var itemsFromDB = ReadFromDB();
            PutItemsInCache(itemsFromDB);
            return itemsFromDB;
        }
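One refinement, in case the single lock becomes a bottleneck: one shared lock serializes misses for all cache keys, so a slow DB read for one resource blocks callers of unrelated resources. A per-key Lazy<T> stored in a ConcurrentDictionary keeps the "only the first caller hits the DB" behavior while letting different keys proceed independently. This is only a sketch; CoalescedReader, Pending and the readFromDb delegate are illustrative names, not part of any framework:

        using System;
        using System.Collections.Concurrent;
        using System.Threading;

        public static class CoalescedReader
        {
            // One Lazy per cache key. Lazy runs its factory exactly once,
            // so concurrent callers for the same key share a single DB
            // read, while callers for other keys are not blocked.
            private static readonly ConcurrentDictionary<string, Lazy<object>> Pending =
                new ConcurrentDictionary<string, Lazy<object>>();

            public static object Get(string key, Func<object> readFromDb)
            {
                Lazy<object> lazy = Pending.GetOrAdd(
                    key,
                    k => new Lazy<object>(readFromDb,
                        LazyThreadSafetyMode.ExecutionAndPublication));
                try
                {
                    return lazy.Value;
                }
                finally
                {
                    // Drop the in-flight entry once done; the materialized
                    // result should live in your shared cache, not here.
                    Lazy<object> removed;
                    Pending.TryRemove(key, out removed);
                }
            }
        }

You would call Get("resource-123", () => ReadFromDB()) in place of the raw ReadFromDB call, and still write the result into your shared cache as before.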
Gaurav Mantri
  • But it will only work within a single instance and not across multiple instances of the WebRole... right? Are you guys okay with sending one request from each web role? – Ravi Gupta Aug 25 '14 at 10:42
  • Not if your cache is shared. If the cache is local then I would agree with you but your cache is shared by all instances. Correct? – Gaurav Mantri Aug 25 '14 at 10:57
  • Our cache is shared. Looks like I didn't explain correctly what I meant in my above comment. Suppose you have 3 instances of the web role running in your service, and 3 requests for the same resource land on the service at the same time, one on each instance. Now all 3 instances will go to the DB to fetch the resource, because your lock object is not shared across WebRole instances. Correct? – Ravi Gupta Aug 25 '14 at 11:54
  • I see. I agree that above approach will result in 3 DB hits if the request hits 3 different instances at the same time. – Gaurav Mantri Aug 25 '14 at 12:24
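For the cross-instance case raised in this comment thread, the usual approach at the time was a blob lease from the Microsoft.WindowsAzure.Storage SDK: only one role instance can hold the lease on a given blob, so it can act as a cluster-wide lock. A rough sketch, where connectionString and the container/blob names are placeholders and real code would need retry/backoff when the lease is already held:

        using System;
        using Microsoft.WindowsAzure.Storage;
        using Microsoft.WindowsAzure.Storage.Blob;

        // One blob per resource acts as the cross-instance lock. Only the
        // instance holding the lease reads the DB; others retry the cache.
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlockBlob lockBlob = account.CreateCloudBlobClient()
            .GetContainerReference("locks")
            .GetBlockBlobReference("resource-123");

        if (!lockBlob.Exists())
        {
            lockBlob.UploadText(string.Empty); // the blob must exist to be leased
        }

        // AcquireLease throws a StorageException (409 Conflict) if another
        // instance holds the lease; catch that and poll the shared cache
        // instead of hitting the DB.
        string leaseId = lockBlob.AcquireLease(TimeSpan.FromSeconds(30), null);
        try
        {
            // re-check the shared cache, then do the single DB read + cache fill
        }
        finally
        {
            lockBlob.ReleaseLease(AccessCondition.GenerateLeaseCondition(leaseId));
        }

This adds a storage round-trip to every cache miss, so it is only worth doing for the expensive queries (like the 6-8 sec one described above).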