
I can't seem to properly configure nginx to return robots.txt content. Ideally, I don't need the file and just want to serve text content configured directly in nginx. Here's my config:

server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    root /usr/share/nginx/html;
    index index.html index.htm;

    server_name localhost;

    location = /robots.txt {
        #return 200 "User-agent: *\nDisallow: /";
        #alias /media/ws/crx-apps/eap/welcome-content/robots.txt;
        alias /usr/share/nginx/html/dir/robots.txt;
    }

    location / {
        try_files $uri $uri/ =404;
    }
}

None of the directives in the `= /robots.txt` location work, and I don't get why. Accessing http://localhost/robots.txt gives a 404, yet http://localhost/index.html is served properly.

Note that I didn't change any of nginx's default settings from `apt-get install nginx`, apart from adding the new location (for testing).

Denys S.

1 Answer


Firstly, I think the problem in your config is the regular expression used for matching. It helps to write an exact-match location like this in your config to prevent mistakes with pattern matching:

location = /robots.txt {
    alias /usr/share/nginx/html/dir/robots.txt;
}

Secondly, you should also check the permissions and the effective owner of /usr/share/nginx/html/dir/robots.txt. The worker processes run as an unprivileged user (www-data on Debian/Ubuntu), so that user needs read access to the file and execute permission on every directory above it.
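
Finally, since you said you'd ideally skip the file entirely, the commented-out `return` approach from your own config should also work once the content type is set. A minimal sketch (the `default_type` line is my assumption here; without it the response would go out with the server-wide default MIME type):

location = /robots.txt {
    # Serve the content inline; no file on disk is needed.
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}

Either way, `curl -i http://localhost/robots.txt` should then return a 200; the `=` modifier guarantees this location takes precedence over the generic `location /` block.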

Sir.pOpE
  • Sorry, I changed the pattern a while ago and just didn't correct it here. The file exists and is owned by root, just like `/usr/share/nginx/html/index.html`, which is served properly. – Denys S. Jul 15 '15 at 10:46