Over the last few weeks, Google Search Console has been reporting an error for a growing number of my pages. The Coverage report flags them as "Submitted URL blocked by robots.txt".
As you can see, my robots.txt is very simple, so I am at a loss as to why this error occurs for about 20% of my pages:
User-agent: *
Disallow: /cgi-bin/
Allow: /
Sitemap: https://www.theartstory.org/sitemapindex.xml
Host: https://www.theartstory.org
Example pages that show the error:
https://www.theartstory.org/movement-fauvism-artworks.htm
https://www.theartstory.org/artist-hassam-childe-life-and-legacy.htm
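To rule out an obvious mistake in the rules themselves, I ran a quick local check with Python's urllib.robotparser (a minimal sketch, assuming the deployed robots.txt is identical to the copy above; Google's own parser may of course behave differently). It reports both example URLs as allowed for Googlebot:

from urllib.robotparser import RobotFileParser

# The same rules as in my robots.txt (assumption: the live file
# matches this copy; Sitemap/Host lines are irrelevant to matching).
rules = """\
User-agent: *
Disallow: /cgi-bin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The two example URLs that Search Console flags as blocked.
for url in (
    "https://www.theartstory.org/movement-fauvism-artworks.htm",
    "https://www.theartstory.org/artist-hassam-childe-life-and-legacy.htm",
):
    # Prints True for both, i.e. the parser considers them crawlable.
    print(url, "->", parser.can_fetch("Googlebot", url))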