In Chrome, when I enter https://www.example.com/robots.txt, my robots.txt file is served and works fine. I'm happy that it works, but I'm not sure why it does.

In the config below I thought that my last location block, location /, was a catch-all that would handle every request that didn't match one of the regexes in the location blocks above it. In practice I use this block to serve files through my Node.js / Express server.
I didn't include a location ~* \.(txt)$ block, so I thought location / would send requests for robots.txt to Node.js, which would throw an error or otherwise fail to deliver the file. To my surprise, this setup worked fine. Why is that?
server {
    include conf.d/listen-443;
    server_name www.example.com;
    root /srv/example/views/public;

    # Static assets: matching files are served from the root above
    location ~* \.(jpg|png|svg|webp|ico)$ {
    }

    location ~* \.(css)$ {
    }

    location ~* \.(html)$ {
    }

    location ~* \.(js)$ {
    }

    # Everything else is proxied to the Node.js / Express app
    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
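
For what it's worth, this is the kind of block I assumed I would need to add for .txt files to be served statically, mirroring the empty regex locations above (just a sketch of what I had in mind; it is not in my actual config):

location ~* \.(txt)$ {
    # empty like the other static locations; matching files
    # would be served from the root directive above
}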