I host a few git repositories at git.nomeata.de using gitweb (and gitolite). Occasionally, a search engine spider comes along and begins to hammer the interface. Since I generally do want my git repositories to show up in search engines, I do not want to block the spiders completely, but they should not invoke expensive operations such as snapshotting the archive, searching, or generating diffs.
What is the “best” robots.txt file for such an installation?
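For concreteness, here is one possible sketch of what I have in mind, not necessarily the best answer. It assumes gitweb is served from the site root with its default query-style URLs (?p=repo.git;a=…), and it relies on wildcard patterns, which are a de-facto extension supported by the big crawlers (Googlebot, Bingbot) but not part of the original robots.txt standard:

    User-agent: *
    # Keep crawlers away from the expensive gitweb actions,
    # while leaving summary, log, commit and tree pages crawlable:
    Disallow: /*a=search      # full-text and commit search
    Disallow: /*a=snapshot    # on-the-fly archive generation
    Disallow: /*a=blobdiff    # per-file diffs
    Disallow: /*a=commitdiff  # per-commit diffs

Spiders that only implement the original standard will ignore the wildcard lines entirely, so this is at best a partial solution; I am curious whether there is a more robust pattern for this setup.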