We have an Umbraco website with several sub-domains, and we want to exclude one of them from being crawled by search engines for now. I tried changing my robots.txt file, but it seems I am not doing it right.
Subdomain: http://sub1.mywebsite.co.dl/
My robots.txt content is as follows:
User-agent: *
Disallow: sub1.*
What have I missed?
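From what I have read, Disallow only takes URL paths relative to the host that serves the robots.txt, not domain names or wildcards like sub1.*, so my guess is that I need a separate robots.txt served from the subdomain itself, i.e. at http://sub1.mywebsite.co.dl/robots.txt, containing something like the following (just my sketch of what I think it should look like, not confirmed):

User-agent: *
Disallow: /

Is that the right approach, and if so, how would I serve a different robots.txt per sub-domain from a single Umbraco installation?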