I want to allow only a single crawler, Googlebot, to access my website. In addition, I want Googlebot to crawl and index my site based on the sitemap only.
Is this the right code?
I know that only "well-behaved" bots follow robots.txt instructions, but it's still a starting point.
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /

Sitemap: https://example.com/sitemap.xml
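As a sanity check, the rules above can be tested locally with Python's standard `urllib.robotparser` (a sketch only; the file contents are pasted inline rather than fetched, and real crawlers may match user agents differently):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the question, pasted inline for testing.
rules = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot should be allowed everywhere; any other crawler blocked.
print(parser.can_fetch("Googlebot", "https://example.com/page"))      # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/page"))   # False
print(parser.site_maps())  # ['https://example.com/sitemap.xml']
```

Note that `site_maps()` requires Python 3.8+, and the Sitemap line is independent of any User-agent group, so it can sit anywhere in the file.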