I have this robots.txt:
User-agent: *
Disallow: /files/

User-agent: ia_archiver
Allow: /

User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:

User-agent: Googlebot-Mobile
Disallow:
I am finding that PDF files in the /files/ directory are being indexed by Google.
Should I move the first entry to the bottom?
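As a sanity check on how crawlers resolve these groups, Python's standard-library `urllib.robotparser` applies the same rule Google does: a bot that has its own `User-agent` group obeys only that group and ignores the `*` group entirely. A small sketch against a copy of the file above (the PDF path is just an illustrative example):

```python
import urllib.robotparser

# A copy of the robots.txt above (blank lines separate the groups).
ROBOTS_TXT = """\
User-agent: *
Disallow: /files/

User-agent: ia_archiver
Allow: /

User-agent: Googlebot
Disallow:
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own group, and an empty "Disallow:" means
# "allow everything" -- so "Disallow: /files/" never applies to it.
print(parser.can_fetch("Googlebot", "/files/example.pdf"))     # True

# A bot with no group of its own falls back to * and is blocked.
print(parser.can_fetch("SomeOtherBot", "/files/example.pdf"))  # False
```

This suggests the order of the groups is irrelevant: the empty `Disallow:` under `User-agent: Googlebot` is what lets Googlebot into `/files/`, wherever the `*` group sits.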
While working in Google's Webmaster Tools, I moved the /files/ disallow to the bottom and ran a test on one PDF file in the files directory, and it returned "Success".
How can I fix this issue? We do not want anything in this directory to be indexed.
EDIT:
Even if I remove everything except the first group,

User-agent: *
Disallow: /files/

Google is still able to see PDFs in the /files/ directory. What am I doing wrong here?
In Bing's Webmaster Tools the URL shows as blocked, but Google's test still reports "Success".
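For what it's worth, the same standard-library parser agrees with Bing here: the stripped-down two-line file blocks /files/ for every bot, Googlebot included. (Note this only speaks to crawling; a disallow in robots.txt does not by itself remove a URL Google already knows about from the index.)

```python
import urllib.robotparser

# The minimal file from the edit above.
MINIMAL = """\
User-agent: *
Disallow: /files/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(MINIMAL.splitlines())

# With only the * group present, every crawler is blocked from /files/.
print(rp.can_fetch("Googlebot", "/files/example.pdf"))  # False
print(rp.can_fetch("Bingbot", "/files/example.pdf"))    # False
```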