
I want to allow access to my website for a single crawler only: Googlebot. In addition, I want Googlebot to crawl and index my site according to the sitemap only.

Is this the right code?

I know that only "good" bots follow the robots.txt instructions, but still, it's a starting point.

User-agent: Googlebot
Allow: /
User-agent: *
Disallow: /

Sitemap: https://example.com/sitemap.xml

  • This question appears to be off-topic because it is about **SEO** which is off-topic at Stack Overflow. Please read ["Which SEO questions should be closed as non-programming/non-admin?"](//meta.stackoverflow.com/a/382618) to better understand when SEO questions are acceptable to ask here (most are not) and where you might be able to get assistance. – John Conde Jun 10 '20 at 13:41

1 Answer


This is what you want for your robots.txt (see Google's robots.txt documentation for more examples):

User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
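
If you want to sanity-check how these rules are likely to be interpreted, Python's standard-library urllib.robotparser gives a rough approximation of a crawler's view. It is not Google's exact parser, so treat this as a sketch; the example.com URLs are just placeholders.

from urllib.robotparser import RobotFileParser

# The rules from the answer above, fed to the parser as a list of lines.
rules = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot matches its own group and may fetch anything.
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))     # True

# Any other crawler falls through to the catch-all group and is blocked.
print(parser.can_fetch("SomeOtherBot", "https://example.com/some-page"))  # False

You can keep the Sitemap: https://example.com/sitemap.xml line from your original file; Sitemap directives are independent of the User-agent groups, so they do not conflict with these rules. Note, though, that a sitemap is a hint rather than a restriction: Googlebot may still crawl and index URLs it discovers elsewhere on your site.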