I'd like to block all bots from crawling the subdirectory http://www.mysite.com/admin,
plus any files and folders inside it. For example, there may be further directories inside /admin,
such as http://www.mysite.com/admin/assets/img
I'm not sure of the exact declarations to include in robots.txt to do this.
Should it be:
User-agent: *
Disallow: /admin/
Or:
User-agent: *
Disallow: /admin/*
Or:
User-agent: *
Disallow: /admin/
Disallow: /admin/*
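
If it helps, here is a minimal sketch of how I could test the first option locally with Python's standard-library urllib.robotparser (note it uses simple prefix matching, which may not reflect how every crawler handles wildcards; the logo.png URL is just a hypothetical file for illustration):

from urllib import robotparser

# The first candidate robots.txt from above.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# URLs expected to be blocked, plus one that should stay crawlable.
test_urls = [
    "http://www.mysite.com/admin/",
    "http://www.mysite.com/admin/assets/img/logo.png",  # hypothetical file, for illustration
    "http://www.mysite.com/index.html",
]

for url in test_urls:
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")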