I am using SEMrush for SEO. I am not sure why their site reports the following error when it tries to crawl my page:
We couldn't crawl this page using the SEMrushBot user agent due to an HTTP error 406. Nevertheless, we were able to collect a few general ideas for this page. Please ensure that your page can be accessed by search engine crawlers, and then start optimizing it using our ideas.
What could cause a 406 error that prevents bots from crawling the page? I believe Google is able to crawl it, since I don't see any crawl errors on their side.
What can I check?
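One check I can run myself is to request the page with and without the bot's User-Agent and compare status codes. This is a minimal sketch: the URL is a placeholder for my page, and the SemrushBot User-Agent string is the one listed in SEMrush's bot documentation, so it may differ by bot version.

import requests

# Placeholder URL; the SemrushBot UA string comes from SEMrush's bot
# docs (http://www.semrush.com/bot.html) and may vary by version.
URL = "https://example.com/"
SEMRUSH_UA = "Mozilla/5.0 (compatible; SemrushBot/7~bl; +http://www.semrush.com/bot.html)"

for label, headers in [("default UA", {}), ("SemrushBot UA", {"User-Agent": SEMRUSH_UA})]:
    response = requests.get(URL, headers=headers, timeout=10)
    print(f"{label}: HTTP {response.status_code}")

If the default request returns 200 but the bot User-Agent gets a 406, then the server (for example a security module or firewall rule) is rejecting requests based on their headers rather than anything in robots.txt.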
My robots.txt file:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
Disallow: /admin/
Disallow: /old-site/
Sitemap: https://example.com/sitemap.xml
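For completeness, a quick check with Python's standard urllib.robotparser (again with example.com standing in for my real domain) confirms that nothing in this file disallows SemrushBot:

from urllib.robotparser import RobotFileParser

# Parse the live robots.txt and ask whether SemrushBot may fetch the homepage.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()
print(rp.can_fetch("SemrushBot", "https://example.com/"))  # expected: True

As far as I understand, a 406 ("Not Acceptable") is an HTTP-level response from the server, so robots.txt rules could not produce it anyway.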