
I have submitted a sitemap with many thousands of URLs, but Webmaster Tools claims that 9,800 of my URLs are blocked by my robots.txt file.

What am I supposed to do to convince it that nothing is being blocked?

[Screenshot: URLs reported as blocked by robots.txt]

[Screenshot: blank robots.txt file]

  • The first picture shows that the links are indeed being blocked. The second shows that the robots.txt is indeed blank. What are you talking about? – weexpectedTHIS Sep 25 '14 at 22:36

1 Answer


Sometimes this just means that the robots.txt file couldn't be reached (it returned a 5xx server error, or Googlebot simply got no response). In those cases, Google will treat any URL it attempts to crawl as being disallowed by robots.txt. You can see that in the Crawl Errors section of Webmaster Tools (under the site errors at the top).
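One quick way to check for this is to fetch the robots.txt yourself and look at the status code it returns. Here is a minimal Python sketch (example.com is a placeholder for your own site):

    import urllib.request
    import urllib.error

    # Placeholder URL -- substitute your own domain.
    ROBOTS_URL = "https://example.com/robots.txt"

    try:
        with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
            print(f"robots.txt returned HTTP {resp.status}")
            body = resp.read().decode("utf-8", errors="replace")
            print(body if body.strip() else "(empty file: nothing is disallowed)")
    except urllib.error.HTTPError as e:
        # A 5xx here is the likely culprit: Google treats an unreachable
        # robots.txt as "disallow everything" until it can fetch it again.
        print(f"robots.txt fetch failed with HTTP {e.code}")
    except urllib.error.URLError as e:
        print(f"robots.txt could not be reached: {e.reason}")

Keep in mind that a single successful fetch from your own machine doesn't rule out intermittent 5xx errors on Google's side, so check the Crawl Errors history as well.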

– John Mueller