I just changed my DNS settings so the folder /forum is now served as a subdomain instead of a subdirectory. If I create a robots.txt file that says:
User-agent: *
Disallow: /forum
Will that disallow crawling for the subdirectory AND subdomain?
I want to disallow crawling of the subdirectory but ALLOW crawling of the subdomain. Note: this is on shared hosting, so the forum can still be visited at both the subdirectory and the subdomain, which is why I have this issue.
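From what I've read, crawlers fetch robots.txt separately for each hostname, so what I think I'd want (using example.com as a placeholder for my real domain) is something like this:

At http://example.com/robots.txt (block the subdirectory):
User-agent: *
Disallow: /forum

At http://forum.example.com/robots.txt (allow everything on the subdomain):
User-agent: *
Disallow:

But since both hostnames are served from the same hosting account, I'm not sure how to get each hostname to return its own robots.txt.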
So, how can I permit crawling only for the subdomain?
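One idea I had, assuming my host allows mod_rewrite in .htaccess (and example.com / robots-forum.txt are just placeholder names I made up), is to serve a different file whenever robots.txt is requested on the subdomain:

RewriteEngine On
# If the request came in on forum.example.com, answer /robots.txt
# with a separate file (robots-forum.txt) that allows crawling
RewriteCond %{HTTP_HOST} ^forum\.example\.com$ [NC]
RewriteRule ^robots\.txt$ robots-forum.txt [L]

Is that the right approach, or is there a simpler way to do this?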