I have two sites, dev.example.com and www.example.com, on different subdomains. I want crawlers to drop all records of the dev subdomain but keep indexing www.
Both sites share the same codebase in git, so ideally I'd like them to use the same robots.txt file.
Is it possible to use one robots.txt file and have it exclude crawlers from the dev subdomain only?
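For reference, what I'd want the dev subdomain to serve (if it could differ from www) is a blanket disallow like:

```
User-agent: *
Disallow: /
```

while www would keep a permissive robots.txt.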