I'm using Aegir/Barracuda/Nginx to maintain a multisite setup. My "files" directory is symlinked to a mounted "files" directory, so when I clone a site for dev purposes the clone uses the same "files" directory. The problem with the current practice of using sites/mydomain/files as the location for robots.txt is that I'm unable to put custom directives in my cloned development site to stop crawlers from indexing it, which means I risk being penalized for duplicate content. Is there a workaround for this?
My files directory pretty much has to be symlinked because it holds a LOT of media files and it wouldn't make sense to recreate the entire "files" directory every time I clone a site.
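To illustrate the constraint (with hypothetical paths, not my actual Aegir layout): because both sites' "files" directories are symlinks to the same shared mount, a robots.txt saved through either site is the same physical file, so the live site and the dev clone can't hold different copies.

```shell
# Hypothetical sketch: two site dirs whose "files" are symlinks to one shared mount.
base=$(mktemp -d)
mkdir -p "$base/shared-files" "$base/sites/mydomain.com" "$base/sites/dev.mydomain.com"
ln -s "$base/shared-files" "$base/sites/mydomain.com/files"
ln -s "$base/shared-files" "$base/sites/dev.mydomain.com/files"

# Writing robots.txt through the dev clone changes what the live site serves too.
printf 'User-agent: *\nDisallow: /\n' > "$base/sites/dev.mydomain.com/files/robots.txt"
cat "$base/sites/mydomain.com/files/robots.txt"
```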