I am using java.time.Instant to measure how quickly a web address responds to a simple request. My overall goal is to estimate the latency between my home computer and a service I set up on a remote machine about 200 miles away. Here's a simplified version of the code, which otherwise runs fine:
...//instantiated all necessary variables here....
Instant starts = Instant.now();
...
//Get information from web address
url = new URL(webAddress);
inputstream = url.openStream();
br = new BufferedReader(new InputStreamReader(inputstream));
//Read the full response (readLine blocks until the server sends data)
while ((line = br.readLine()) != null)
{
instanceOutput += line;
}
Instant ends = Instant.now();
...
//Output the duration
return ("time required to get result: " + Duration.between(starts, ends).toMillis());
This tends to produce results of 0 milliseconds, or 14-16 milliseconds, or occasionally longer durations that are roughly multiples of 15 milliseconds. But a latency of 0 milliseconds is physically impossible, even after rounding down: the machines are 200 miles apart, so even light in a vacuum needs about a millisecond each way, and a real signal travelling over fiber and through routers needs more, realistically at least 2 to 3 milliseconds each way.
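For what it's worth, here is the back-of-the-envelope arithmetic behind that claim (the figure of roughly two thirds of c for propagation in fiber is my assumption, not something I measured):

public class LatencyFloor {
    public static void main(String[] args) {
        double meters = 200.0 * 1609.344;                    // 200 miles expressed in meters
        double c = 299_792_458.0;                            // speed of light in vacuum, m/s
        double oneWayVacuumMs = meters / c * 1000.0;         // ~1.07 ms: the hard physical floor, one way
        double oneWayFiberMs = oneWayVacuumMs / (2.0 / 3.0); // ~1.61 ms, assuming signals move at ~0.66c in fiber
        System.out.printf("vacuum: %.2f ms, fiber: %.2f ms%n", oneWayVacuumMs, oneWayFiberMs);
    }
}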
The only similar problem I've been able to find is a question from back in 2009, Timer accuracy in java, whose second answer suggests a reason (though for a different method), and the documentation link there is broken. It also seems that Java's timekeeping may have been updated since then. And the claim that the system timer isn't updated continuously seems extraordinary; I can't find it repeated anywhere else.
Is this a good enough way to measure a duration to the nearest millisecond? From what I've recently read, System.nanoTime() would be more precise, but I don't need that much precision, and it would be bad form for me to fix a problem without trying to understand why it happened and learning from it. What am I doing wrong or misunderstanding?
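For reference, this is roughly the System.nanoTime() variant I would switch to if that turns out to be the right fix. It's only a sketch: the URL is a placeholder and the structure just mirrors the simplified code above.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class NanoTimingSketch {
    public static void main(String[] args) throws Exception {
        String webAddress = "http://example.com/";  // placeholder address
        long startNanos = System.nanoTime();        // monotonic timer intended for measuring elapsed time

        StringBuilder instanceOutput = new StringBuilder();
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(new URL(webAddress).openStream()))) {
            String line;
            while ((line = br.readLine()) != null) {
                instanceOutput.append(line);        // accumulate the response body
            }
        }

        long elapsedMillis = (System.nanoTime() - startNanos) / 1_000_000L;
        System.out.println("time required to get result: " + elapsedMillis);
    }
}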