
I am getting this error on the first line of my robots.txt: `User-agent: *`

My robots.txt is as follows:

```
User-agent: *
Disallow: /Search/
Disallow: /_layouts/
Disallow: /blog/_layouts/
Disallow: /Blog/_layouts/
Disallow: /ReusableContent/
Disallow: /Reports%20List/
Disallow: /WorkflowTasks/
Disallow: /SiteCollectionImages/
Disallow: /Documents/Forms/
Disallow: /Pages/Forms/
Disallow: /Internet/
```

I have noticed there is an encoded space in `/Reports%20List/`. Is this creating issues? I am not sure. Any help would be appreciated.

Preetam
  • "getting this error" - from where? Should it be `User-Agent` with a capital `A`? – Rup May 21 '13 at 10:08
  • SEO people in my company are getting this error, and I have looked around; the `A` does not need to be capitalized. – Preetam May 21 '13 at 10:11
  • So you can't reproduce this error yourself? You should ask them for whatever you need to reproduce it. – Rup May 21 '13 at 10:15
  • Have you tried `Reports List` ? – HamZa May 21 '13 at 11:13
  • Is it possible that the file is not a plain text file? If it's a MS Word document or HTML, the robots.txt parser won't understand it. Easy way to determine if the `%20` is causing a problem: remove it and see if the file works then. – Jim Mischel May 21 '13 at 19:51

1 Answer


The trouble here is the 3 invisible bytes at the beginning of the UTF-8 encoded robots.txt file, the so-called BOM (byte order mark). The BOM is optional. Yes, the file should be UTF-8 encoded, but Google currently does not tolerate the optional BOM in what it expects to be a plain and simple robots.txt file, and parsing fails.
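If you want to check this yourself, here is a minimal sketch in Python (assuming the file is named `robots.txt` and sits in the current directory) that detects the UTF-8 BOM and rewrites the file without it:

```python
# Minimal sketch: detect a UTF-8 BOM at the start of robots.txt and strip it.
# The filename "robots.txt" is an assumption; point it at your actual file.

BOM = b"\xef\xbb\xbf"  # the three invisible bytes of the UTF-8 byte order mark

with open("robots.txt", "rb") as f:
    data = f.read()

if data.startswith(BOM):
    print("BOM found, removing it")
    with open("robots.txt", "wb") as f:
        f.write(data[len(BOM):])  # write the file back without the leading BOM
else:
    print("no BOM present")
```

You can also inspect the first bytes directly, e.g. with `head -c 3 robots.txt | xxd`; a BOM shows up as `ef bb bf`. Most text editors also have a "UTF-8 without BOM" option when saving.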

You can read a more detailed explanation HERE.