Is there any way to write rules in robots.txt or .htaccess that will block all bots coming from a .ru domain? Thanks
What have you tried? What research have you done into how one might accomplish this? – EEAA Aug 12 '15 at 18:23
Take a look at `mod_evasive`. You can configure apache to block a client if it attempts too many requests within a given time frame. https://www.linode.com/docs/websites/apache-tips-and-tricks/modevasive-on-apache – Gene Aug 12 '15 at 22:10
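To make the mod_evasive suggestion above concrete, here is a minimal sketch; the module name in the IfModule test varies by build, and the threshold values are illustrative, not tuned recommendations:

<IfModule mod_evasive20.c>
    # requests for the same page allowed per page interval
    DOSPageCount 5
    DOSPageInterval 1
    # requests for any object site-wide allowed per site interval
    DOSSiteCount 50
    DOSSiteInterval 1
    # seconds an offending client stays blocked with a 403
    DOSBlockingPeriod 60
</IfModule>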
1 Answer
Yes, you can with .htaccess and Apache's host-based Allow/Deny rules. A quick bit of googling turns up many examples; you could have found them yourself before asking here...
http://httpd.apache.org/docs/1.3/howto/auth.html#allowdeny
For example:
Order allow,deny
Allow from all
Deny from .ru
(With allow,deny, the Deny directives are evaluated last, so the .ru match takes effect. With the frequently pasted deny,allow ordering, Allow from all would be evaluated last and override the deny, letting everyone in.)
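Note that Order/Deny/Allow is Apache 2.2 syntax (mod_access_compat in later versions). On Apache 2.4, a sketch of the equivalent, assuming mod_authz_host is loaded, would be:

<RequireAll>
    Require all granted
    Require not host .ru
</RequireAll>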
However, this relies on the client's IP address reverse-resolving to a .ru hostname, so it can be bypassed (and it misses .ru-operated bots hosted elsewhere). For a more robust method of blocking countries or TLDs you should implement a separate security appliance.
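As for the robots.txt half of the question: robots.txt can only target bots by their declared User-agent token, not by the domain they come from, and compliance is entirely voluntary. A minimal sketch asking one well-known .ru-based crawler (Yandex, used here purely as an illustrative example) to stay away:

User-agent: Yandex
Disallow: /

Any bot that ignores robots.txt will ignore this too, which is why the .htaccess approach above is the enforceable one.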

– tomstephens89
So adding something like this:
deny from .ru
will block all .ru user agents? – bob Aug 12 '15 at 18:42
@bob, you have to specify the Order first, then the Deny and Allow rules. However, this is very likely a bad idea. It requires a reverse DNS lookup for every request, and if your system sees a lot of traffic it has the potential to slow things down greatly. – Gene Aug 12 '15 at 19:45
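A sketch of the lookup-free alternative Gene's point suggests: denying by IP range instead of hostname, since CIDR ranges are matched against the client address directly with no DNS involved. The range below is a documentation placeholder, not a real Russian allocation:

Order allow,deny
Allow from all
# 192.0.2.0/24 is an illustrative placeholder range
Deny from 192.0.2.0/24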
Thanks for that info; I never would have guessed it. I'll leave it as is for now. – bob Aug 12 '15 at 19:56