I am running Nginx 1.1.19 on an Ubuntu 12.04 server, and I'm having trouble getting Googlebot to see the robots.txt file. I used the examples from this post, but without success. To test it, I open Webmaster Tools and click "Integrity > Search as Googlebot", but I only get the messages "Not Found", "Page not available", and "robots.txt file is not accessible".
I would also like to confirm whether the configuration should go in nginx.conf or in the "default" file in /etc/nginx/sites-enabled, because I noticed this may differ between versions.
These are my basic settings:
root /usr/share/nginx/www;
index index.php;
# Rewrite the URLs.
location / {
    try_files $uri $uri/ /index.php;
}
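For reference, the kind of rule I have been experimenting with is a dedicated location block for robots.txt, roughly like the sketch below (assuming robots.txt is placed directly in the web root, /usr/share/nginx/www):

```nginx
# Serve robots.txt directly from the web root instead of
# letting the request fall through to the PHP front controller.
location = /robots.txt {
    allow all;
    log_not_found off;
    access_log off;
}
```

With this in place, I expected a request to /robots.txt to be answered as a plain static file, but Googlebot still reports it as inaccessible.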