I have a free site hosted on Byethost, which uses iFastNet's hosting services. For a while now my site has been de-indexed by Google, because Google's crawler can no longer fetch any part of the site. I can access the site without issues in any web browser, and I have not changed any of the site's code. My robots.txt is even set to:
User-agent: *
Allow: /
Strangely enough, the Google Structured Data Testing Tool is able to fetch any page, but every other Google crawler tool fails to fetch resources from the site.
For example, when I test robots.txt through Google's robots.txt Tester, it says:
You have a robots.txt file that we are currently unable to fetch. In such cases we stop crawling your site until we get hold of a robots.txt, or fall back to the last known good robots.txt file. Learn more.
So Google is clearly being blocked somehow before it can even reach the server and download the robots.txt file.
When I cURL the site from my computer, I also get a 403 error. But when I cURL a copy of the site hosted on a private server, it returns the page without issue. Surely this must be related to Google's problem.
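For reference, this is roughly what I'm running (the hostnames below are placeholders, not my real domains):

curl -I http://mysite.byethost7.com/            # hosted on Byethost: returns 403
curl -I http://mysite-mirror.example.com/       # same files on a private server: returns 200
curl -I -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" http://mysite.byethost7.com/
# the last one tests whether the 403 depends on the User-Agent header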
But I have no .htaccess rules or security features like IP blacklisting enabled, so this is very confusing.
My free plan with Byethost also should not include Cloudflare, since that is a premium feature.
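One thing I did to sanity-check that was to look at the response headers for anything that suggests a proxy or security layer in front of the server (again, placeholder hostname):

curl -sI http://mysite.byethost7.com/ | grep -i -E "^(server|cf-ray|set-cookie):"
# a "Server: cloudflare" or "CF-RAY" header here would indicate Cloudflare is in the path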
Can anyone provide some insight into why I am getting these 403 errors? Is it the same reason Google can no longer access the site? And how can I fix this?
Thank you