I am developing a Python web application (Tornado) and plan to put it into production with nginx in front. This will be my first time deploying something to a production environment on my own, and my question is how to set up files/directories for static serving.

For instance, my application allows users to upload photos. I receive the upload requests in Tornado and save the files to disk. However, when a user visits their items page, I would rather the images be served by a static server. What is the best practice for getting the images from my dynamic server to the static server? Do I rsync the image directory to the static server and then run a cron job that deletes the images from the dynamic server?
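For reference, here is roughly what the upload side looks like right now (a minimal sketch; the handler name, form field, upload directory, and port are just illustrative):

    import os
    import tornado.ioloop
    import tornado.web

    UPLOAD_DIR = "/srv/myapp/uploads"  # illustrative path; uploaded images land here for now

    class PhotoUploadHandler(tornado.web.RequestHandler):
        def post(self):
            # Tornado parses multipart uploads into self.request.files;
            # "photo" is just the form field name used here as an example.
            fileinfo = self.request.files["photo"][0]
            path = os.path.join(UPLOAD_DIR, fileinfo["filename"])
            with open(path, "wb") as f:
                f.write(fileinfo["body"])
            # The item page later references the image as /images/<filename>,
            # which I would like nginx (or a separate static server) to serve.
            self.write("/images/" + fileinfo["filename"])

    application = tornado.web.Application([
        (r"/upload", PhotoUploadHandler),
    ])

    if __name__ == "__main__":
        application.listen(8888)
        tornado.ioloop.IOLoop.instance().start()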
1 Answer
Best practice is to use shared storage, but if you can't use it, you can use the "proxy_store" option in nginx. Example from the nginx documentation:
    location /images/ {
        root /data/www;
        error_page 404 = @fetch;
    }

    location @fetch {
        internal;
        proxy_pass http://backend;
        proxy_store on;
        proxy_store_access user:rw group:rw all:r;
        proxy_temp_path /data/temp;
        root /data/www;
    }
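With this config, the first request for an image is proxied to your backend and nginx stores the response under /data/www; every later request is served straight from disk, so there is no rsync/cron step. Note that proxy_store is a simple mirror-on-demand, not a cache: nginx never expires or invalidates the stored files, so you have to clean them up yourself if images can change or be deleted.

The example does assume the backend can serve the file itself on that first request. With Tornado that is just a StaticFileHandler route for the directory the uploads are written to (a sketch; the path is illustrative):

    import tornado.web

    UPLOAD_DIR = "/srv/myapp/uploads"  # wherever the upload handler saves files

    application = tornado.web.Application([
        # nginx's @fetch location proxies the first request for
        # /images/<name> here; after that the stored copy is used.
        (r"/images/(.*)", tornado.web.StaticFileHandler, {"path": UPLOAD_DIR}),
    ])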

oxyum
- Could you describe the setup for shared storage? I googled around but couldn't find an example of how it would be done. It doesn't need to be concrete; a pseudo example would work great! – vikash dat Mar 14 '11 at 17:04
- Any network file system can be used as shared storage; NFS, for example. – oxyum Mar 15 '11 at 15:51
- Also, some companies use WebDAV as the protocol for uploading files to shared storage. – oxyum Mar 15 '11 at 15:53