
I have a site that uses wildcard subdomains, so that when somebody signs up they get their own subdomain.

I do not want Google (or other search engines) to crawl and index any of the subdomains (except for www.)

Is there a way to do this with robots.txt?

Regards

pjknight

1 Answer


I'm guessing no, at least not directly with a single global robots.txt file. See: http://www.seomoz.org/q/block-an-entire-subdomain-with-robots-txt

Somewhere on that page andykuiper wrote:

You can block an entire subdomain via robots.txt; however, you'll need to create a robots.txt file and place it in the root of the subdomain, then add the code to direct the bots to stay away from the entire subdomain's content.

User-agent: *
Disallow: /

Make a script that creates or copies the robots.txt file into each newly created subdomain's root, and everything should work as intended.
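As a rough sketch of that script (the document-root layout and subdomain name here are assumptions; adjust them to match your hosting setup):

```shell
#!/bin/sh
# Drop a disallow-everything robots.txt into a subdomain's document
# root so crawlers stay out of that subdomain entirely.
# Usage: block_subdomain <docroot>
block_subdomain() {
    docroot="$1"
    mkdir -p "$docroot"
    # Same two-line rule quoted above: block all user agents from all paths.
    printf 'User-agent: *\nDisallow: /\n' > "$docroot/robots.txt"
}

# Example: call this from your signup code after creating the subdomain.
# The path below is a hypothetical layout, not a requirement.
block_subdomain "${TMPDIR:-/tmp}/subdomains/newuser"
```

Hook this into whatever provisions the subdomain at signup time, and every new subdomain will start out blocked while www. (which never receives this file) stays crawlable.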

ZZ-bb