
Over the last few weeks Google has been reporting an error in Search Console. More and more of my pages are not allowed to be crawled; the Coverage report says: "Submitted URL blocked by robots.txt".

As you can see, my robots.txt is very simple, so I am at a loss as to why this error occurs for about 20% of my pages:

User-agent: *
Disallow: /cgi-bin/
Allow: /
Sitemap: https://www.theartstory.org/sitemapindex.xml
Host: https://www.theartstory.org

Example pages that show the error:

https://www.theartstory.org/movement-fauvism-artworks.htm

https://www.theartstory.org/artist-hassam-childe-life-and-legacy.htm
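
For reference, here is a quick offline check with Python's urllib.robotparser (just one way to test the rules locally; Google's own parser may behave differently). Under the rules above it reports both example URLs as allowed and only /cgi-bin/ as blocked:

import urllib.robotparser

# The rules exactly as posted above; the Sitemap and Host lines
# do not affect can_fetch().
rules = """\
User-agent: *
Disallow: /cgi-bin/
Allow: /
Sitemap: https://www.theartstory.org/sitemapindex.xml
Host: https://www.theartstory.org
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

for url in (
    "https://www.theartstory.org/movement-fauvism-artworks.htm",
    "https://www.theartstory.org/artist-hassam-childe-life-and-legacy.htm",
    "https://www.theartstory.org/cgi-bin/anything",
):
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")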


1 Answer


Your robots.txt file is incorrectly configured. You should only need:

User-agent: *
Crawl-delay: 40
Disallow: /cgi-bin/

Sitemap: https://www.theartstory.org/sitemapindex.xml

Submit that for your robots.txt and try the crawl again.
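
As a sanity check after uploading (this only mirrors a local parse, not Google's evaluation), you can point the same urllib.robotparser module at the live file to confirm the copy your server actually serves allows the affected pages:

import urllib.robotparser

# Fetch the robots.txt the server currently serves (requires network access).
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.theartstory.org/robots.txt")
rp.read()

for url in (
    "https://www.theartstory.org/movement-fauvism-artworks.htm",
    "https://www.theartstory.org/artist-hassam-childe-life-and-legacy.htm",
):
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")

If these still print "blocked", the server is likely serving an older or different copy of the file than the one you edited.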