
I have a website, Flamingone.com. I registered it with Google Webmaster Tools a few days ago. Today I checked what Google Webmaster Tools had reported about my website. Google said:

"Googlebot couldn't crawl your URL (/cgi-bin) (flamingone.com/cgi-bin), because your server either requires authentication to access the page, or it is blocking Googlebot from accessing your site.

But I have neither a folder named cgi-bin nor a robots.txt file. A request for that URL should return a 404 (my 404.php page). Why is it reporting access denied?
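
For reference, one way to see what the server actually returns for that URL is to request it with different user agents; a minimal sketch in Python 3 (the Googlebot user-agent string is Google's published one, and the exact response depends on the server configuration):

import urllib.request
import urllib.error

URL = "http://flamingone.com/cgi-bin"  # the path from the Webmaster Tools report

for agent in ("Mozilla/5.0", "Googlebot/2.1 (+http://www.google.com/bot.html)"):
    req = urllib.request.Request(URL, method="HEAD",
                                 headers={"User-Agent": agent})
    try:
        with urllib.request.urlopen(req) as resp:
            print(agent, "->", resp.status)  # 200 means the page is served
    except urllib.error.HTTPError as e:
        # urlopen raises for 4xx/5xx; 404 means not found,
        # while 401/403 would explain the "access denied" report
        print(agent, "->", e.code)

If the two user agents get different status codes, the server is treating Googlebot specially.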

1 Answer


Try creating a robots.txt in your root folder with the following content:

robots.txt

User-agent: *
Disallow: /cgi-bin/
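
This tells compliant crawlers, including Googlebot, not to request anything under /cgi-bin/, so the crawl error should stop being reported. To confirm what a given robots.txt blocks, Python's standard urllib.robotparser can check a URL against it (a small sketch; the user agent and URLs are just the ones from the question):

import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# False: a compliant bot such as Googlebot will not fetch this URL
print(rp.can_fetch("Googlebot", "http://flamingone.com/cgi-bin/"))
# True: everything else stays crawlable
print(rp.can_fetch("Googlebot", "http://flamingone.com/index.html"))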
– CMPS
  • Thanks for your reply. But until I saw the above-mentioned problem in Google Webmaster Tools, I had a robots.txt file with exactly the content you have suggested. The Webmaster Tools said, "If you don't want to block search engines from crawling any link, you don't need a robots.txt." So I deleted it. – Radhamadhab Sarangi Jun 27 '14 at 04:31
  • @Mr.RMS A robots.txt can allow or disallow bots, so its mere existence does not matter, but its content does. – CMPS Jun 27 '14 at 06:02