
Does anyone know how to stop automated scanners from scanning a web application or website? Apart from robots.txt, is there any other configuration or server-side modification that can be made?

Rohit Sharma

1 Answer

  1. You can add a /robots.txt file that asks scanners nicely not to scan all or parts of your site. Most legitimate search-engine robots follow the instructions you put in robots.txt.

For example:

User-agent: *
Disallow: /api/
  2. If you want to be fancier, you can add a honeypot endpoint at /api/ that normal browsers will never request. Your application's HTML may reference /api/customers and /api/suppliers, for example, but never the bare /api/.

When an automated scanner requests /api/, you can reject any further requests from that IP for 10 minutes. It may help a little.
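As a rough sketch of that honeypot idea, here is a minimal in-memory version in Python. The names (`handle_request`, `BLOCK_SECONDS`) and the in-memory dict are illustrative assumptions; a real deployment would track blocks in something shared such as Redis or firewall rules.

```python
import time

# Hypothetical in-memory block list: IP -> time it was blocked.
# In production this state would live somewhere shared (e.g. Redis).
BLOCK_SECONDS = 600  # 10 minutes, as suggested above
_blocked = {}

def is_blocked(ip, now=None):
    """Return True if this IP is still inside its block window."""
    now = time.time() if now is None else now
    blocked_at = _blocked.get(ip)
    if blocked_at is None:
        return False
    if now - blocked_at >= BLOCK_SECONDS:
        del _blocked[ip]  # window expired, forget the IP
        return False
    return True

def handle_request(ip, path, now=None):
    """Honeypot check: a request to the bare /api/ trips the trap."""
    now = time.time() if now is None else now
    if is_blocked(ip, now):
        return 403  # still inside the 10-minute ban
    if path == "/api/":  # real clients only ever hit /api/customers etc.
        _blocked[ip] = now
        return 403
    return 200
```

A scanner walking the URL tree hits /api/ and gets itself banned, while a normal client that only requests the concrete endpoints is never affected.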

  3. A Web Application Firewall (WAF) like ModSecurity with the OWASP Core Rule Set may also help.
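For instance, a ModSecurity rule along these lines could reject requests whose User-Agent matches well-known scanner names (the rule id and the agent list are arbitrary examples, and real scanners can of course spoof their User-Agent):

```apache
# Illustrative ModSecurity v2 rule, not a complete configuration.
SecRuleEngine On
SecRule REQUEST_HEADERS:User-Agent "@pm nikto sqlmap acunetix nessus" \
    "id:900001,phase:1,deny,status:403,log,msg:'Known scanner user agent'"
```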
rjdkolb