First, some context for my question...
I have a Node.js web app that works fine. In AWS I have a load balancer (Elastic Load Balancing, Classic) in front of an Auto Scaling Group with multiple EC2 instances.
Each EC2 instance has an Nginx reverse proxy installed, which works fine with PM2 and Node.js (Express). I use Nginx for caching static files, caching requests, HTTPS, and other things.
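For context, each instance starts the Express app under PM2 roughly like this simplified ecosystem file (the app name, entry point and values here are placeholders, not my real config):

// ecosystem.config.js (simplified, hypothetical values)
module.exports = {
  apps: [
    {
      name: 'web-app',        // placeholder app name
      script: './server.js',  // placeholder entry point
      env: {
        NODE_ENV: 'production',
        PORT: 3000,           // Nginx proxies to this port (see config below)
      },
    },
  ],
};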
I need to know how I can share a single cache between all the instances, instead of each machine keeping its own local Nginx cache. I would like to use Memcached or Redis; I have to choose one, and I prefer Redis (I've sketched below, after the Nginx config, roughly what I have in mind).
This is how I'm set up at the moment (only the relevant config):
proxy_cache_path /cache/nginx levels=1:2 keys_zone=cache_zone_name:10m;

location / {
    #root   html;
    #index  index.html index.htm;

    # Reverse proxy to the Node.js (Express) app
    proxy_pass http://localhost:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;

    # Add cache debugging header
    add_header X-Cache-Status $upstream_cache_status;

    # Configure the cache
    proxy_cache cache_zone_name;
    proxy_cache_valid any 1m;
    proxy_cache_key $scheme$host$request_uri;
}
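For the shared cache, what I have in mind is something at the application level, with every instance pointing to the same Redis endpoint (for example ElastiCache). This is just a minimal sketch, assuming ioredis and a hypothetical ElastiCache hostname, not real code from my app:

// Sketch of application-level response caching shared via Redis.
// Assumes a Redis (ElastiCache) endpoint reachable from every instance;
// the hostname, key prefix and TTL below are placeholders.
const express = require('express');
const Redis = require('ioredis');

const app = express();
const redis = new Redis({
  host: 'my-cache.xxxxxx.cache.amazonaws.com', // hypothetical ElastiCache endpoint
  port: 6379,
});

const CACHE_TTL_SECONDS = 60; // mirrors the 1m proxy_cache_valid above

// Cache GET responses in Redis so every instance behind the ELB sees the same entries.
function sharedCache(req, res, next) {
  if (req.method !== 'GET') return next();
  const key = 'cache:' + req.originalUrl;

  redis.get(key).then((cached) => {
    if (cached) {
      // A real version should also store and restore the Content-Type.
      res.set('X-App-Cache', 'HIT');
      return res.send(cached);
    }
    // Wrap res.send so the rendered body is stored before it goes out.
    const originalSend = res.send.bind(res);
    res.send = (body) => {
      if (res.statusCode === 200 && typeof body === 'string') {
        redis.set(key, body, 'EX', CACHE_TTL_SECONDS).catch(() => {});
      }
      res.set('X-App-Cache', 'MISS');
      return originalSend(body);
    };
    next();
  }).catch(next);
}

app.use(sharedCache);

app.get('/', (req, res) => {
  res.send('Hello from ' + process.pid);
});

app.listen(3000);

The idea is that Nginx on each instance keeps caching static files locally, while dynamic responses come from the shared Redis store, so a response cached by one instance can be served by any other. Is that the right direction, or is there a way to point Nginx's own proxy cache at a shared backend?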
I found this question on Server Fault, but it has no answer.
Thanks!
Edit: Important note, I need a solution that doesn't hurt performance :)