How can I block specific URLs using robots.txt? We do not want Google to crawl our site. How can I define Disallow rules for those URLs in a robots.txt file?
1 Answer
To exclude all robots from part of the server, for example from these three folders, use:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
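Since the question is specifically about Google, you can also target Google's crawler by its user-agent token instead of *. The paths below are only placeholders; replace them with the URLs you want to keep Google out of:

User-agent: Googlebot
Disallow: /private/      # example path, replace with your own
Disallow: /old-catalog/  # example path, replace with your own

Note that Disallow: / on its own blocks the whole site for that user agent, while an empty Disallow: allows everything. A Googlebot section only applies to Google, so keep the * section as well if you want other crawlers excluded too.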
