We have eight REST-ish API servers running Nginx with PHP-FPM over FastCGI. We currently use Nginx's built-in FastCGI caching (`fastcgi_cache_path` and related directives), so API responses are cached, but each server maintains its own separate cache.
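For context, here is a stripped-down sketch of what each server runs today (zone name, paths, socket, and timings are illustrative, not our exact configuration):

```nginx
fastcgi_cache_path /var/cache/nginx/fastcgi levels=1:2 keys_zone=api_cache:100m
                   max_size=1g inactive=60m;

server {
    listen 80;
    server_name api.example.com;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass unix:/run/php/php-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

        # Each of the eight servers caches responses on its own local disk,
        # so the caches are filled and expired independently.
        fastcgi_cache api_cache;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 301 10m;
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```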
Is there a good way to share cache storage among all eight servers?
We have considered using Redis as shared storage, but the available modules appear to require changes to the application. In some cases we also want to cache responses from services outside our control (fetched over HTTP from external APIs). Ideally there would be a drop-in replacement for Nginx's built-in caching of both FastCGI and HTTP responses, as illustrated below.
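For the external-API case, we cache with the built-in proxy cache in much the same way; this is the kind of block a shared-cache solution would ideally slot into unchanged (the upstream hostname and zone name below are placeholders):

```nginx
proxy_cache_path /var/cache/nginx/proxy levels=1:2 keys_zone=ext_cache:50m
                 max_size=512m inactive=30m;

server {
    listen 8081;

    location /external/ {
        # Responses from a third-party API we do not control,
        # cached locally per server just like the FastCGI responses.
        proxy_pass https://api.external-provider.example/;
        proxy_cache ext_cache;
        proxy_cache_key "$scheme$proxy_host$request_uri";
        proxy_cache_valid 200 5m;
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```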