
We have a dedicated development server on a public network which runs only test PHP applications.
We have set up session-based authentication for the site.

The issue we have is that lots of 404s for robots.txt are logged in the access log.
So we want to block/ignore these requests in lighttpd to save some bandwidth.

How can we achieve this in lighttpd/1.4.31?
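One way to do this is with a lighttpd URL conditional plus mod_access. A minimal, untested sketch (the exact placement in your lighttpd.conf, and whether mod_access is already in your server.modules list, are assumptions):

```
# Sketch: deny all requests for /robots.txt with a short 403 response.
# Requires "mod_access" in server.modules.
$HTTP["url"] == "/robots.txt" {
    # An empty-string entry matches every request under this conditional.
    url.access-deny = ( "" )
}
```

Note that this still logs the request (now as a 403 instead of a 404); it only avoids serving a full 404 error page body.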

Vishnu Kumar
  • Wouldn't it be better to put a dummy robots.txt file there? – Waleed Hamra Nov 19 '12 at 14:20
  • @WaleedHamra, but doesn't serving a dummy robots.txt file also use bandwidth? The applications on the server take around 150K requests per day, so I suppose serving a dummy file will use 150K*1K of bandwidth. Am I right? – Vishnu Kumar Nov 19 '12 at 14:25
  • 4
    a 1 byte file will generate 150K bytes of data... much smaller than a complete 404 page, if your server serves one. – Waleed Hamra Nov 19 '12 at 14:40
  • Why are search engines crawling your development server in the first place?!? – Michael Hampton Nov 19 '12 at 16:40
  • @WaleedHamra, oh ok, didn't think of it that way. Thanks for pointing that out. @MichaelHampton, no idea; I have a domain via which all the applications are accessed. Could that be the reason? – Vishnu Kumar Nov 20 '12 at 11:29
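The dummy-file approach suggested in the comments can be very small indeed. A robots.txt that tells all well-behaved crawlers to stay away is about 26 bytes (the docroot path is an assumption):

```
# Place in the document root, e.g. /var/www/robots.txt
User-agent: *
Disallow: /
```

At roughly 150K requests/day that is under 4 MB/day of response bodies, and as a bonus compliant crawlers should stop requesting other URLs on the development server as well.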

0 Answers