I'm trying to compute the checksum of a transferred file. The traditional way is to receive the file, write it to disk, then read it back from disk and compute the checksum. Alternatively, I can write and read simultaneously to speed this up. I observed that writing and reading concurrently finishes faster, since the reads are served from the page cache rather than going to disk. However, I'm worried about whether my checksum calculation is still reliable: isn't one of the reasons for computing a checksum to detect disk write errors? If so, would concurrently writing and reading miss disk write errors?
FileOutputStream fos = new FileOutputStream("testwrite.jpg");
InputStream is = Files.newInputStream(Paths.get("testwrite.jpg"));
MessageDigest md = null;
try {
    md = MessageDigest.getInstance("MD5");
} catch (NoSuchAlgorithmException e) {
    e.printStackTrace();
}
DigestInputStream dis = new DigestInputStream(is, md);
byte[] bufferWrite = new byte[4096];
byte[] bufferRead = new byte[4096];
long current = 0L;
long startTime = System.currentTimeMillis();
while (current < totalWriteSize) { // totalWriteSize is defined elsewhere
    fos.write(bufferWrite, 0, 4096);
    fos.flush();
    // read() may return fewer than 4096 bytes; loop until the chunk just
    // written has been fully consumed, so the digest covers every byte
    int readSoFar = 0;
    while (readSoFar < 4096) {
        int n = dis.read(bufferRead, readSoFar, 4096 - readSoFar);
        if (n < 0) break; // EOF is unexpected here, but guard against it
        readSoFar += n;
    }
    current += 4096;
}
dis.close();
fos.close();
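For what it's worth, I also considered hashing the received bytes directly as they arrive, before they ever reach the disk, which avoids the second read entirely (though it obviously cannot catch disk write errors either). A minimal sketch, where a ByteArrayInputStream stands in for the incoming network stream:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.security.DigestInputStream;
import java.security.MessageDigest;

public class StreamDigest {

    // Computes the MD5 of everything read from `in`; the DigestInputStream
    // updates the digest transparently as bytes pass through it.
    static byte[] md5Of(InputStream in) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (DigestInputStream dis = new DigestInputStream(in, md)) {
            byte[] buf = new byte[4096];
            while (dis.read(buf) != -1) {
                // bytes would be written to disk here as well
            }
        }
        return md.digest();
    }

    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // "abc" stands in for the transferred payload
        InputStream in = new ByteArrayInputStream("abc".getBytes("UTF-8"));
        System.out.println(hex(md5Of(in)));
        // prints 900150983cd24fb0d6963f7d28e17f72 (the standard MD5 test vector for "abc")
    }
}
```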