The documentation for System.nanoTime() says the following (emphasis mine):

This method can only be used to measure elapsed time and is not related to any other notion of system or wall-clock time.
One quite interesting feature of the difference between System.currentTimeMillis() and System.nanoTime() is that System.nanoTime() does NOT change with the wall clock. I run code on a Windows virtual machine that has heavy time drift. System.currentTimeMillis() can jump back or forward by 1-2 seconds each time NTP corrects that drift, making accurate timestamps meaningless (Windows 2003 and 2008 VPS editions).

System.nanoTime(), however, is not affected by changes to the wall-clock time. So you can take a time retrieved over NTP and apply a correction based on how much System.nanoTime() has advanced since NTP was last checked, and you get a far more accurate time than System.currentTimeMillis() under adverse wall-clock conditions.

This is of course counter-intuitive, but useful to know.
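As a rough illustration of that correction idea, here is a minimal sketch (the class and method names are mine, not from any library): capture one wall-clock reading together with a System.nanoTime() reading, then derive later timestamps from the monotonic clock so they are immune to wall-clock jumps.

// Sketch: anchor one wall-clock reading (e.g. taken right after an NTP sync)
// to a System.nanoTime() reading, then extrapolate from the monotonic clock.
public final class DriftFreeClock {
    private final long baseWallClockMillis; // wall-clock time captured once
    private final long baseNanos;           // monotonic reference captured at the same moment

    public DriftFreeClock() {
        this.baseWallClockMillis = System.currentTimeMillis();
        this.baseNanos = System.nanoTime();
    }

    // Wall-clock estimate derived from the monotonic clock; unaffected by later clock steps.
    public long currentMillis() {
        long elapsedNanos = System.nanoTime() - baseNanos;
        return baseWallClockMillis + elapsedNanos / 1_000_000L;
    }
}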
In the Clojure REPL, I get:
user=> (- (System/nanoTime) (System/nanoTime))
0
user=> (- (System/nanoTime) (System/nanoTime))
0
user=> (- (System/nanoTime) (System/nanoTime))
-641
user=> (- (System/nanoTime) (System/nanoTime))
0
user=> (- (System/nanoTime) (System/nanoTime))
-642
user=> (- (System/nanoTime) (System/nanoTime))
-641
user=> (- (System/nanoTime) (System/nanoTime))
-641
So essentially, nanoTime doesn't get updated every nanosecond, contrary to what one might intuitively expect from its precision. On Windows it uses the QueryPerformanceCounter API under the hood (according to this article), which in practice seems to give a resolution of about 640 ns (on my system!).
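If you want to estimate that granularity on your own machine, a minimal sketch (class name is mine) is to spin until the returned value changes and print the size of each tick:

// Sketch: estimate how often System.nanoTime() actually ticks by busy-waiting
// until the returned value changes, then printing the size of the jump.
public class NanoTimeGranularity {
    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            long t0 = System.nanoTime();
            long t1;
            do {
                t1 = System.nanoTime();
            } while (t1 == t0); // spin until the counter moves
            System.out.println("tick: " + (t1 - t0) + " ns");
        }
    }
}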
Note that nanoTime can't, by itself, have any accuracy at all, since its absolute value is arbitrary. Only the difference between successive nanoTime calls is meaningful. The (in)accuracy of that difference is in the ballpark of 1 microsecond.
The first interpretation is correct. On most systems the three least-significant digits will always be zero. This in effect gives microsecond accuracy, but reports it at the fixed precision level of a nanosecond.
In fact, now that I look at it again, your second interpretation is also a valid description of what is going on, maybe even more so. Imagining frozen time, the report would always be the same value: wrong as a count of nanoseconds, but correct if understood as an integer number of microseconds.
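To check whether your own JVM reports values as whole microseconds, a quick sketch (class name is mine) is to print a few raw readings together with their remainder modulo 1000:

// Sketch: print raw System.nanoTime() values and their last three digits.
// If the remainder is always 0, the clock is effectively reporting whole microseconds.
public class TrailingDigits {
    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            long t = System.nanoTime();
            System.out.println(t + "  (t % 1000 = " + (t % 1000) + ")");
        }
    }
}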
If, like me, you have read this question again and again and still only kind of understand it, here is a simpler (I hope) explanation.
Precision is about how many digits you retain. Each of the:
long start = System.nanoTime();
long end = System.nanoTime();
is going to be a precise number (lots of digits).
Since accuracy is measured only relative to something else, an individual call to System.nanoTime makes no sense on its own: its value is quite arbitrary and does not depend on anything we can measure. The only way to assess its accuracy is to compare two different calls to it, thus:
long howMuch = end - start;
is not going to have nanosecond accuracy. And in fact on my machine the difference is 0.2-0.3 microseconds.
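To see what that figure looks like on your own machine (the 0.2-0.3 microseconds above is specific to mine), a minimal sketch (class name is mine) is to average the gap between two consecutive calls over many iterations:

// Sketch: average the gap between two back-to-back System.nanoTime() calls.
// The result reflects both call overhead and clock resolution, and will vary
// by OS, hardware, and JVM.
public class NanoTimeGap {
    public static void main(String[] args) {
        final int iterations = 1_000_000;
        long total = 0;
        for (int i = 0; i < iterations; i++) {
            long start = System.nanoTime();
            long end = System.nanoTime();
            total += (end - start);
        }
        System.out.println("average gap: " + (total / (double) iterations) + " ns");
    }
}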