I have searched existing posts for an answer about how long a call to System.nanoTime() itself takes to execute.
Consider the following code:
long lastTime = System.nanoTime();       // first timestamp
long currentTime = System.nanoTime();    // second timestamp, taken immediately after
long deltaTime = currentTime - lastTime; // elapsed time between the two calls, in nanoseconds
If you run this, deltaTime evaluates to 0. The only way I can see this happening is if the second method call completed in less time than a nanosecond can resolve (i.e. the call took less than a nanosecond). Logically this makes sense to me, because a computer can (on average) execute multiple instructions in a single nanosecond.
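As a rough sanity check, one could loop until nanoTime() returns a different value and count how many calls that takes (a minimal sketch using only standard Java; the exact numbers will vary by JVM, OS, and hardware):

public class NanoTimeGranularity {
    public static void main(String[] args) {
        long start = System.nanoTime();
        long next;
        int calls = 0;
        // Spin until the clock reports a new value, counting how many
        // nanoTime() calls fit inside one observable tick.
        do {
            next = System.nanoTime();
            calls++;
        } while (next == start);
        System.out.println("Value changed after " + calls
                + " call(s); delta = " + (next - start) + " ns");
    }
}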
Is this correct? If not, where is my logic wrong?