I have the following scenario:

  1. I have multiple instances of a Flask application behind a load balancer serving an API for a front-end web client. This Flask application interfaces with a database to query and insert data.
  2. Under certain conditions, an event in my system will cause all the clients to make the exact same query to the API, resulting in many identical calls to the database.
  3. To prevent unnecessary DB queries, I would like to implement a cache using Redis as the backend storage solution. I'm using the Flask-Caching library with the RedisCache backend. All instances of the API service share the same Redis backend.
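For context, this is roughly the setup I mean (a minimal sketch; the Redis host, timeout values, route name, and `run_slow_db_query` helper are all placeholders, not my real code):

```python
from flask import Flask
from flask_caching import Cache

app = Flask(__name__)
# All API instances point at the same shared Redis server.
app.config.update(
    CACHE_TYPE="RedisCache",            # Flask-Caching's Redis backend
    CACHE_REDIS_HOST="redis.internal",  # placeholder host
    CACHE_REDIS_PORT=6379,
    CACHE_DEFAULT_TIMEOUT=300,          # seconds; placeholder value
)
cache = Cache(app)

@app.route("/expensive")
@cache.cached(timeout=60)               # result shared across all instances via Redis
def expensive_endpoint():
    return run_slow_db_query()          # hypothetical helper; can take up to ~3 s
```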

My question is this: will the Redis backend block processes trying to read from and write to the cache? Each request to the cached endpoint can take up to 3 seconds in the worst case, and while the first request is waiting for a response, many additional requests may hit that endpoint. Will they end up waiting for the cache to be written by the original request, or will they all fall through to the database?
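To illustrate the failure mode I'm worried about, here is a small stdlib-only sketch (a plain dict and threads standing in for Redis and the API workers; `slow_query` is a placeholder for the expensive DB call). Every worker checks the cache before any of them has filled it, so every one of them runs the query:

```python
import threading

cache = {}                      # stand-in for the shared Redis cache
compute_calls = 0               # counts how many times the "DB query" ran
barrier = threading.Barrier(5)  # forces all 5 workers to race on the same miss
count_lock = threading.Lock()

def slow_query():
    # Stands in for the expensive (up to ~3 s) database query.
    global compute_calls
    with count_lock:
        compute_calls += 1
    return "result"

def handle_request():
    miss = "key" not in cache        # every worker checks before anyone writes...
    barrier.wait()                   # ...because they all pause here first
    if miss:
        cache["key"] = slow_query()  # so every worker hits the database

threads = [threading.Thread(target=handle_request) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(compute_calls)  # prints 5: all workers ran the query despite the cache
```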

I know Redis has a way to create locks, and I would assume that a cache would implement some sort of lock around cached values, but I cannot find any documentation on how Flask-Caching interfaces with Redis in this respect.
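What I imagine such a lock would do is something like the following sketch (stdlib threading standing in for a distributed Redis lock; this is my assumption about the technique, not anything I've found in Flask-Caching): the first worker to miss takes the lock and fills the cache, and every other worker re-checks after acquiring it and finds the value already there.

```python
import threading

cache = {}                    # stand-in for the shared Redis cache
compute_calls = 0             # counts how many times the "DB query" ran
fill_lock = threading.Lock()  # stand-in for a shared Redis lock
count_lock = threading.Lock()

def slow_query():
    # Stands in for the expensive (up to ~3 s) database query.
    global compute_calls
    with count_lock:
        compute_calls += 1
    return "result"

def handle_request():
    if "key" in cache:              # fast path: cache hit, no lock needed
        return cache["key"]
    with fill_lock:                 # only one worker may fill the cache at a time
        if "key" not in cache:      # re-check: another worker may have filled it
            cache["key"] = slow_query()
    return cache["key"]

threads = [threading.Thread(target=handle_request) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(compute_calls)  # prints 1: the query ran exactly once
```

Is this (or something like it) what Flask-Caching's Redis backend actually does, or do concurrent misses all go to the database?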
