I just changed the DNS settings so the folder /forum is now a subdomain instead of a subdirectory. If I create a robots.txt file that says:

User-agent: *
Disallow: /forum

Will that disallow crawling for the subdirectory AND subdomain?

I want to disallow crawling of the subdirectory, but ALLOW crawling of the subdomain. Note: this is on shared hosting, so both the subdirectory and the subdomain can be visited. This is why I have this issue.

So, how can I permit crawling only for the subdomain?

Graham

1 Answer


Yes, that is the correct way to stop crawling of the subdirectory: a robots.txt file applies only to the host it is served from, so the one at www.yourdomain.com/robots.txt has no effect on the subdomain. But note: if the URLs are already indexed, they won't be removed just because crawling stops.
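As a sketch of the per-host behaviour, assuming the subdomain serves its own robots.txt out of the /forum folder (the hostnames are illustrative):

# http://www.yourdomain.com/robots.txt (blocks the subdirectory)
User-agent: *
Disallow: /forum

# http://forum.yourdomain.com/robots.txt (allows everything)
User-agent: *
Disallow:

If the subdomain has no robots.txt at all, crawlers treat it as fully allowed anyway.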

The approach I would prefer is to set all pages in the subdirectory to "noindex, follow" via meta tags, or better still, use the "canonical" tag to point the search engines to the subdomain URL.
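For the meta-tag variant, each page under the subdirectory would carry something like:

<meta name="robots" content="noindex, follow">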

For the canonical tag, on a given URL like "http://www.yourdomain.com/directoryname/post-of-the-day", use:

<link rel="canonical" href="http://directoryname.yourdomain.com/post-of-the-day" />

The latter URL will then be the only one shown in the SERPs.

netzaffin