I want to allow crawling of files in:
/directory/
but not crawling of files in:
/directory/subdirectory/
Is the correct robots.txt instruction:
User-agent: *
Disallow: /subdirectory/
I'm afraid that if I disallowed /directory/subdirectory/, I would be disallowing crawling of all files in /directory/, which I do not want to do. So am I correct in using:
User-agent: *
Disallow: /subdirectory/
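For reference, the full-path alternative I'm hesitant about (the one I fear would be too broad and block everything under /directory/) would be:
User-agent: *
Disallow: /directory/subdirectory/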