My CMS lives in a subfolder, so I forward everything to it via .htaccess. The snippet below works fine for the CMS, but it breaks files that have to stay in the web root, such as robots.txt (e.g. https://domain.xyz/robots.txt). If I request that URL, browsers and crawlers are (of course) redirected to https://domain.xyz/TEST:
<IfModule mod_rewrite.c>
RewriteEngine On
# Force HTTPS
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://domain.xyz%{REQUEST_URI} [L,R=301]
# Force the canonical host name
RewriteCond %{HTTP_HOST} !^domain\.xyz$ [NC]
RewriteRule ^ https://domain.xyz/TEST [L,R=301]
# Send everything else to the /TEST subfolder
RewriteCond %{REQUEST_URI} !^/TEST
RewriteRule ^ https://domain.xyz/TEST [L,R=301]
</IfModule>
So I need to skip those files. For robots.txt and sitemap.xml I would add
RewriteCond %{THE_REQUEST} !/(robots\.txt|sitemap\.xml)\s [NC]
before the RewriteRule, but it doesn't work. What's wrong? Could somebody please help me with that? Thank you.
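To make the placement clear, here is the full ruleset as I tried it. The %{THE_REQUEST} condition is the line I inserted, and I put it in front of the final catch-all rule (whether that is the right rule to attach it to is exactly what I'm unsure about):

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://domain.xyz%{REQUEST_URI} [L,R=301]
RewriteCond %{HTTP_HOST} !^domain\.xyz$ [NC]
RewriteRule ^ https://domain.xyz/TEST [L,R=301]
# my attempted exclusion, placed directly before the catch-all rule
RewriteCond %{THE_REQUEST} !/(robots\.txt|sitemap\.xml)\s [NC]
RewriteCond %{REQUEST_URI} !^/TEST
RewriteRule ^ https://domain.xyz/TEST [L,R=301]
</IfModule>
```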