
Somebody broke our robots.txt by accidentally adding a \n after our Allow: /products/ rule, which covers about 30,000 pages in total. The errors appear on multiple language sites; this is one of our Search Console properties.
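For context, here is a hypothetical sketch of how such a mistake breaks the rule (the User-agent line and exact file contents are assumptions, since only the Allow rule is described):

    # Intended rule:
    User-agent: *
    Allow: /products/

    # Broken variant 1: a literal "\n" appended, so the rule now matches
    # the nonexistent path "/products/\n" instead of "/products/":
    Allow: /products/\n

    # Broken variant 2: an actual line break inserted, leaving an
    # orphaned line that is not a valid robots.txt directive:
    Allow:
    /products/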

I noticed the error quickly and removed the stray \n. I've asked Google to verify the fix, but about three months later the errors are still increasing. See the images below:

[Screenshot of Search Console]

[Screenshot of our robots.txt]

Is there anything I can do to speed up the process? I have already started the validation.

— Kevin Tad

1 Answer


Your robots.txt is unreachable. It is 301-redirected to https://www.unisgroup.nl/robots.txt/, which is a directory, not a file (note the trailing slash). Google looks for a file, doesn't find one, and goes away.
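To see what Google sees, you can fetch the file without following redirects; a minimal sketch using only the Python standard library (the URL is the one from this answer):

    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        # Returning None makes urllib raise HTTPError on any 3xx
        # instead of silently following the redirect.
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None

    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open("https://www.unisgroup.nl/robots.txt", timeout=10)
        # A 200 here means the file is served directly and is crawlable.
        print(resp.status, resp.geturl())
    except urllib.error.HTTPError as e:
        # A 301/302 here confirms the redirect problem described above.
        print(e.code, e.headers.get("Location"))

If the output shows a 301 pointing at the trailing-slash URL, fix the server or rewrite rule so that /robots.txt answers 200 directly.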

— Evgeniy