Let's suppose we have 10K requests per second hitting our PHP script.
Every request checks the cache in memcached (or any other cache storage). If the cache entry is found, everything is fine and the cached value is returned. If it is not found, we run a slow SQL query to fill the cache. It is the most common and simple caching scheme:
$result = $this->loadFromCache($key);
if (empty($result)) {
    $result = $this->makeSlowSqlQuery();
    $this->writeToCache($key, $result);
}
// do something with $result
This scheme works well until the request rate gets too high. With enough concurrent traffic, a large number of requests will miss the cache at the same moment (for example, right after the entry expires), and all of them will start executing the slow SQL query in parallel, causing a load spike on the database. This is often called a cache stampede (or dog-pile effect). What is the solution?
As a possible solution I see the following scenario: the first request that finds the cache invalid sets some kind of trigger (a lock) saying that a cache refill is already in progress, and every other request either waits for the new value or uses the older (previous) version.
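A minimal sketch of that scenario, with the cache simulated by a plain array so it runs standalone (in real code the lock would be acquired atomically, e.g. with Memcached::add(), which succeeds for exactly one caller). The class and method names here are illustrative, not from the original snippet:

```php
<?php
// Stampede protection via a short-lived refill lock.
// The store and the lock table are plain arrays for demonstration;
// with memcached, tryLock() would be Memcached::add($k, 1, $lockTtl).

class StampedeSafeCache
{
    private array $store = [];   // key => ['data' => ..., 'expires' => ts]
    private array $locks = [];   // lock keys currently held

    public function get(string $key, callable $slowQuery, int $ttl = 60)
    {
        $now   = time();
        $entry = $this->store[$key] ?? null;

        // Fresh value: return it immediately.
        if ($entry !== null && $entry['expires'] > $now) {
            return $entry['data'];
        }

        // Stale or missing: try to become the single refiller.
        if ($this->tryLock($key . ':lock')) {
            $data = $slowQuery();                 // only one caller pays this cost
            $this->store[$key] = ['data' => $data, 'expires' => $now + $ttl];
            $this->unlock($key . ':lock');
            return $data;
        }

        // Someone else is refilling: serve the stale value if we have one.
        if ($entry !== null) {
            return $entry['data'];
        }

        // Cold cache and lock held elsewhere: fall back to the slow query.
        return $slowQuery();
    }

    private function tryLock(string $k): bool
    {
        if (isset($this->locks[$k])) {
            return false;          // Memcached::add() would return false here
        }
        $this->locks[$k] = true;
        return true;
    }

    private function unlock(string $k): void
    {
        unset($this->locks[$k]);
    }
}

// Usage: count how many times the "slow query" actually runs.
$cache = new StampedeSafeCache();
$calls = 0;
$query = function () use (&$calls) { $calls++; return 'rows'; };

for ($i = 0; $i < 1000; $i++) {   // 1000 requests for the same key
    $cache->get('report', $query);
}
echo $calls; // 1 — the query ran once, not 1000 times
```

The key design choice is that the lock key is separate from the data key, so the stale value stays readable while one request refreshes it; requests that lose the lock race get the old data instead of piling onto the database.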
How do you solve similar problems?