I don't know much about hash algorithms.
I need to compute the hash of an incoming file on the fly in Java, before forwarding the file to a remote system (a bit like S3) which requires a file hash in MD2/MD5/SHA-X. This hash is not computed for security reasons but simply as a consistency checksum.
I am able to compute this hash on the fly while forwarding the file, using a DigestInputStream from the Java standard library, but I would like to know which algorithm is best to avoid the performance problems of using a DigestInputStream.
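For context, here is a minimal sketch of the DigestInputStream approach I mean (the class and method names around it are my own, and the algorithm name and buffer size are just illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.security.DigestInputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class StreamingHash {

    // Copies 'in' to 'out' while the DigestInputStream updates the
    // digest transparently, so the data is only read once.
    static String hashWhileCopying(InputStream in, OutputStream out, String algorithm)
            throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance(algorithm);
        try (DigestInputStream dis = new DigestInputStream(in, md)) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = != -1) {
                out.write(buffer, 0, n);
            }
        }
        // Render the digest bytes as lowercase hex.
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        String digest = hashWhileCopying(
                new ByteArrayInputStream("abc".getBytes(StandardCharsets.UTF_8)),
                sink, "SHA-256");
        System.out.println(digest);
    }
}
```

In real use, `out` would be the stream to the remote system rather than an in-memory buffer.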
One of my former colleagues tested this and told us that computing the hash live can be quite expensive compared to using a Unix command line tool or hashing the file on disk.
Edit about premature optimization: I work at a company whose goal is to help other companies dematerialize their documents. This means we run a batch which handles document transfers from other companies. We are targeting millions of documents per day in the future, and the execution time of this batch is actually sensitive for our business.
A hashing optimization of 10 milliseconds per document, at 1 million documents per day, reduces the daily execution time by almost 3 hours, which is pretty huge.