I'm having some trouble configuring nginx to share files between three sites. I know I could work around the problem by giving each site its own copy of the files, or by using symlinks, but according to the nginx documentation this should be relatively trivial to configure directly.
All three sites (site1.com, site2.net, site3.net) should serve the same robots.txt, which is located at `/webdata/common/robots.txt`.
All of the sites also share 404 and 500-series error pages, which likewise live in `/webdata/common`.
Lastly, all of the sites share the directory tree `/.dir1/dir2/`, so `www.site1.com/.dir1/dir2/file`, `www.site2.net/.dir1/dir2/file`, and `www.site3.net/.dir1/dir2/file` should all map to `/webdata/type/.dir1/dir2/file`.
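To make the intended mapping concrete, here is a sketch of the behaviour I'm after (this is not my actual config, which is below; it reflects my reading of the nginx docs on how `root` is combined with the request URI, so if I've misread that, it may well be where my mistake is):

```nginx
# Sketch of the desired per-request mapping, the same in all three server blocks:
#   /robots.txt        -> /webdata/common/robots.txt
#   /.dir1/dir2/<file> -> /webdata/type/.dir1/dir2/<file>
#   error pages        -> /webdata/common/error_*.html

location = /robots.txt {
    root /webdata/common/;   # root + URI => /webdata/common/robots.txt
}

location /.dir1/dir2/ {
    root /webdata/type/;     # root + URI => /webdata/type/.dir1/dir2/<file>
}
```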
The error pages work perfectly; however, requests for robots.txt and for anything under /.dir1/dir2/ all return 404 errors.
I have checked and rechecked that nginx has access to read all the files below the /webdata directory.
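In case it helps with diagnosis: my understanding from the docs is that the error log level can be raised per server block to trace which filesystem path nginx actually tries to open for each request (a sketch only; the `debug` level needs nginx built with `--with-debug`, which I haven't confirmed for my build):

```nginx
server {
    # (rest of server block unchanged)
    # Hypothetical diagnostic change: log at debug level so the error log
    # shows the path each request is mapped to before open() is attempted.
    error_log /var/log/nginx/site1.com.error_log debug;
}
```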
Can anyone point out what I'm doing wrong?
Any assistance would be most appreciated.
/etc/nginx/nginx.conf
```nginx
user nginx nginx;
worker_processes 10;
error_log /var/log/nginx/error_log info;
pid /var/run/nginx.pid;
worker_rlimit_nofile 8192;

events {
    worker_connections 1024;
    use epoll;
}

http {
    include /etc/nginx/mime.types;
    index index.html;
    default_type application/octet-stream;

    log_format main '$remote_addr - $remote_user [$time_local] '
                    '"$request" $status $bytes_sent '
                    '"$http_referer" "$http_user_agent" '
                    '"$gzip_ratio"';
    access_log /var/log/nginx/access.log main;

    client_header_timeout 1m;
    client_body_timeout 1m;
    send_timeout 1m;
    connection_pool_size 256;
    client_header_buffer_size 1k;
    large_client_header_buffers 4 2k;
    request_pool_size 4k;

    gzip on;
    gzip_min_length 1100;
    gzip_buffers 4 8k;
    gzip_types text/plain;

    output_buffers 1 32k;
    postpone_output 1460;
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 75 20;
    ignore_invalid_headers on;
    index index.html;

    server {
        listen 123.123.123.123;
        listen 80;
        server_name www.site1.com site1.com;
        access_log /var/log/nginx/site1.com.access_log main;
        error_log /var/log/nginx/site1.com.error_log info;
        root /webdata/brivawn.com/www/;

        location = /robots.txt {
            root /webdata/common/;
            internal;
        }

        location /.dir1/dir2/ {
            root /webdata/type/.dir1/dir2/;
            internal;
        }

        error_page 404 /error_404.html;
        location = /error_404.html {
            root /webdata/common/;
        }

        error_page 500 501 502 503 504 505 506 507 508 509 510 511 520 522 598 599 /error_50x.html;
        location = /error_50x.html {
            root /webdata/common/;
        }
    }

    server {
        listen 123.123.123.123;
        listen 80;
        server_name www.site2.net site2.net;
        access_log /var/log/nginx/site2.net.access_log main;
        error_log /var/log/nginx/site2.net.error_log info;
        root /webdata/site2/www/;

        location = /robots.txt {
            root /webdata/common/;
            internal;
        }

        location /.dir1/dir2/ {
            root /webdata/type/.dir1/dir2/;
            internal;
        }

        error_page 404 /error_404.html;
        location = /error_404.html {
            root /webdata/common/;
        }

        error_page 500 501 502 503 504 505 506 507 508 509 510 511 520 522 598 599 /error_50x.html;
        location = /error_50x.html {
            root /webdata/common/;
        }
    }

    server {
        listen 123.123.123.123;
        listen 80;
        server_name www.site3.net site3.net;
        access_log /var/log/nginx/site3.net.access_log main;
        error_log /var/log/nginx/site3.net.error_log info;
        root /webdata/site3/www/;

        location = /robots.txt {
            root /webdata/common/;
            internal;
        }

        location /.dir1/dir2/ {
            root /webdata/type/.dir1/dir2/;
            internal;
        }

        error_page 404 /error_404.html;
        location = /error_404.html {
            root /webdata/common/;
        }

        error_page 500 501 502 503 504 505 506 507 508 509 510 511 520 522 598 599 /error_50x.html;
        location = /error_50x.html {
            root /webdata/common/;
        }
    }
}
```
```
$ nginx -V
nginx version: nginx/1.8.0
built with OpenSSL 1.0.2d 9 Jul 2015
TLS SNI support enabled
configure arguments: --prefix=/usr --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error_log --pid-path=/run/nginx.pid --lock-path=/run/lock/nginx.lock --with-cc-opt=-I/usr/include --with-ld-opt=-L/usr/lib64 --http-log-path=/var/log/nginx/access_log --http-client-body-temp-path=/var/lib/nginx/tmp/client --http-proxy-temp-path=/var/lib/nginx/tmp/proxy --http-fastcgi-temp-path=/var/lib/nginx/tmp/fastcgi --http-scgi-temp-path=/var/lib/nginx/tmp/scgi --http-uwsgi-temp-path=/var/lib/nginx/tmp/uwsgi --with-ipv6 --with-pcre --with-http_realip_module --with-http_ssl_module --without-mail_imap_module --without-mail_pop3_module --without-mail_smtp_module --user=nginx --group=nginx
```
– DebOfTheWeb Dec 23 '15 at 05:29
I also tried `location /robots.txt { return 200 "User-agent: *\nDisallow: /"; }`, but that still didn't work. The error log shows: `2015/12/23 22:51:19 [error] 2523#0: *12 open() "/wedata/site1.com/robots.txt" failed (2: No such file or directory), client: 999.999.999.999, server: site1.com, request: "GET /robots.txt HTTP/1.1", host: "site1.com"` – DebOfTheWeb Dec 23 '15 at 23:14