
I'd like the robots.txt for the dev. subdomain to point to a specific static robots.txt file.

What routing rule do I need?

This is what I'm thinking:

if ($host = "dev.example.com") {
    location ^~ /robots.txt {
        allow all;
        root /static/disallow/robots.txt;
    }
}

Based on Nginx subdomain configuration I believe I may need to make separate server blocks and use includes. If a simple routing rule won't do it, is the includes method how this is typically done?

Kevin Danikowski

1 Answer


If you want a specific robots.txt for each subdomain, you can do it with separate server blocks like this, which you allude (and link) to in your question:

server {
    server_name subdomainA.domain.com;
    include /common/config/path;
    location = /robots.txt {
        root /subdomainA/path;
    }
}
server {
    server_name subdomainB.domain.com;
    include /common/config/path;
    location = /robots.txt {
        root /subdomainB/path;
    }
}

Regarding your other approach, have you read If is Evil?
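
If you'd rather stay with a single server block, the host check can be done with a map instead of if, which is evaluated safely per request. This is only a sketch; the directory names (/srv/robots/default, /srv/robots/dev) are assumptions you'd replace with your own paths:

```nginx
# map must live at the http level, outside any server block.
# It picks a directory based on the request host.
map $host $robots_root {
    default          /srv/robots/default;   # assumed path
    dev.example.com  /srv/robots/dev;       # assumed path
}

server {
    server_name example.com dev.example.com;

    location = /robots.txt {
        # Note: root names a directory; nginx appends the URI
        # (/robots.txt) to it, so each directory holds its own
        # robots.txt file.
        root $robots_root;
    }
}
```

Note that in your snippet root points at the file itself; root (and alias) take a directory, which is one reason the if-based version wouldn't behave as expected even if nginx allowed a location inside if.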

Jason Rahm