I am using a Tomcat 5.5 server with a web application deployed on it. I want to block HTTP requests that access the .txt files in my project, for example URLs like https://MyDomain/inside/mytest.txt
I think this can be done using security-constraints in the web.xml file, or by writing a custom valve. However, I am exploring the possibility of using a robots.txt file, as it seems very simple. So I have written a robots.txt file to block access to *.txt files as follows --
# go away
User-agent: *
Disallow: /*.txt
I have put it in the ROOT folder and also in all paths inside the webapps folder. However, it doesn't seem to have any effect, and I am still able to access the *.txt files. Are there any other caveats or steps required for the robots.txt file to take effect in Tomcat? Any help here is highly appreciated.
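For comparison, here is a sketch of the security-constraint approach I mentioned above, in case that turns out to be the way to go. This is an assumption on my part based on the standard Servlet deployment descriptor that Tomcat 5.5 uses; the `web-resource-name` is just an arbitrary label I made up:

```xml
<!-- Sketch: deny all requests for paths ending in .txt, -->
<!-- using the standard Servlet deployment-descriptor elements. -->
<security-constraint>
    <web-resource-collection>
        <!-- arbitrary label for this constraint -->
        <web-resource-name>txt-files</web-resource-name>
        <!-- extension mapping: matches every request path ending in .txt -->
        <url-pattern>*.txt</url-pattern>
    </web-resource-collection>
    <!-- an empty auth-constraint grants access to no role, -->
    <!-- so the container rejects every matching request -->
    <auth-constraint/>
</security-constraint>
```

This would go inside the `<web-app>` element of the application's WEB-INF/web.xml, but I was hoping robots.txt would save me from editing the descriptor at all.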