
I am configuring apache2 on Debian and would like to allow only robots.txt to be accessed by search engines, while all other .txt files are restricted. I tried adding the following to .htaccess, but with no luck:

<Files robots.txt>
    Order Allow,Deny
    Allow from All
</Files>

<Files *.txt>
    Order Deny,Allow
    Deny from All
</Files>

Can anyone help or give me some hints? I am a newcomer to Apache, thanks a lot.

user3162764
1 Answer


Use mod_rewrite

RewriteEngine On
RewriteCond %{REQUEST_URI} !/robots\.txt$ [nocase]
RewriteRule \.txt$  -  [forbidden,last]

First, make sure the rewrite engine is enabled.
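
On Debian, that usually means turning the module on and restarting Apache. A minimal sketch, assuming the stock Debian helper scripts and that AllowOverride in your vhost permits .htaccess rewrite rules:

# Enable mod_rewrite and restart Apache (Debian helper scripts)
sudo a2enmod rewrite
sudo service apache2 restart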

Next, use a negated match (!) in a RewriteCond so the RewriteRule is skipped for any URI ending in "/robots.txt".

Lastly, if the URI ends in ".txt" then issue a 403 Forbidden.
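
As a quick check (example.com and secret.txt below are just placeholders for your own host and some other text file), robots.txt should still be served while any other .txt is blocked:

# Should return HTTP 200
curl -I http://example.com/robots.txt

# Should return HTTP 403 Forbidden
curl -I http://example.com/secret.txt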

EDIT: Don't forget the comparisons use regex, so you need to escape special characters (e.g., the . in robots\.txt).

fukawi2