Question
Following Eric Lippert's post from years back on the precision of DateTime, I ran his test on .NET Core and on .NET Framework 4.5.2, on the same Windows 10 machine.
long ticks = DateTime.UtcNow.Ticks; // seed with the current tick count
var n = 1000;
int i = 0;
long[] diffs = new long[n];
while (i++ < n - 1)
{
    if (ticks != DateTime.UtcNow.Ticks)
    {
        var newTicks = DateTime.UtcNow.Ticks;
        var diff = newTicks - ticks;
        diffs[i] = diff;
        ticks = newTicks;
    }
}
foreach (var d in diffs)
{
    if (d == 0)
        Console.WriteLine("same");
    else
        Console.WriteLine(d);
}
foreach (var d in diffs)
{
if (d == 0)
Console.WriteLine("same");
else
Console.WriteLine(d);
}
The result on .NET Framework 4.5.2 was as expected: some random "same" entries in the output, which means DateTime is not precise below a certain level. However, the result on .NET Core was totally different: there was no "same" in the output; no two Ticks values were equal.
What would be the explanation?
Answer 1:
The explanation would be that .NET asks the underlying operating system for the current time, and the operating system asks the underlying hardware. In ancient times the hardware clock (RTC) on the motherboard used to update itself about once every 15 milliseconds. That number was derived from the 60 Hz AC line frequency in the US, which the power grid maintained sufficiently accurately. Remember, those were the days of "slow" computers, and designers tried to squeeze out every bit of performance they could. So the OS did not consult the RTC every time someone asked for the time; it handed back a cached copy of the value, which was updated very infrequently.
Somewhere down the line, the motherboard evolved and RTCs became more precise, but the OS and everything on top of it did not feel the need to follow. Remember, hardware has evolved far faster than software, and even to this day consumer-grade software wastes a large fraction of raw hardware capability. So when the .NET Framework asked the OS for the time, it got back imprecise data even when the hardware was capable of better. The accuracy did improve from 15 ms to below 1 ms, but that was it.
Come Windows 8 (Server 2012), it was finally realized that (1) applications could do better with more precise time, (2) computers are fast, so consulting the clock hardware every time is no longer a problem, and (3) a large population of programmers and programs are used to, and actually rely on, the imprecise time behavior. So Windows 8 introduced a new, marginally slower mechanism to obtain the most precise time data, but left the original implementation unchanged.
.NET had always used the older, imprecise OS function GetSystemTimeAsFileTime, and when its new cousin GetSystemTimePreciseAsFileTime appeared in Windows 8, the .NET Framework chose the backward-compatible path and did nothing. .NET Core is a fresh rewrite of many core features and now leverages the high-precision data source.
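The difference between the two OS functions can be observed directly via P/Invoke. A minimal sketch (Windows-only; marshalling the FILETIME struct as an `out long` is a common shortcut, since FILETIME is a 64-bit count of 100 ns intervals since 1601-01-01 UTC):

```csharp
using System;
using System.Runtime.InteropServices;

class PrecisionDemo
{
    // Classic API: resolution limited by the cached system clock tick.
    [DllImport("kernel32.dll")]
    static extern void GetSystemTimeAsFileTime(out long fileTime);

    // Precise API, available since Windows 8 / Server 2012.
    [DllImport("kernel32.dll")]
    static extern void GetSystemTimePreciseAsFileTime(out long fileTime);

    static void Main()
    {
        GetSystemTimeAsFileTime(out long coarse);
        GetSystemTimePreciseAsFileTime(out long precise);

        // Both values convert with DateTime.FromFileTimeUtc; the precise
        // one typically carries sub-millisecond digits the coarse one lacks.
        Console.WriteLine(DateTime.FromFileTimeUtc(coarse).ToString("o"));
        Console.WriteLine(DateTime.FromFileTimeUtc(precise).ToString("o"));
    }
}
```

On .NET Core, DateTime.UtcNow already draws from the precise source, so calling the API directly is only needed when you must match a specific OS behavior.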
Edit
If the current time reads 13:14:15.123456, there is still no guarantee that the real, true time, as seen by physicists and astronomers, is that. Your computer is not an atomic clock, and certainly not a well-synchronized one. The only thing it means is that if two events received different timestamps, one event certainly happened before the other. On older computers, the rate at which events were generated (logs, files, database transactions, etc.) was lower, so there was little chance that sequential events would be assigned the same timestamp. The new time source caters to modern high-rate activity, so that you can distinguish sequential events. Still, for two very close events there will always be a chance of an identical timestamp; that is ultimately unavoidable. If you need nanosecond-level measurement (why?), you need a different tool such as Stopwatch, not System.DateTime.
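The Stopwatch advice can be sketched as follows. Stopwatch measures elapsed intervals from a high-resolution performance counter, so it is suited to timing operations rather than reading wall-clock time:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class StopwatchDemo
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        Thread.Sleep(50); // stand-in for the operation being measured
        sw.Stop();

        // ElapsedTicks are Stopwatch ticks (Stopwatch.Frequency per second),
        // not DateTime's fixed 100 ns ticks; Elapsed converts them for you.
        Console.WriteLine($"Elapsed: {sw.Elapsed.TotalMilliseconds} ms");
        Console.WriteLine($"High resolution: {Stopwatch.IsHighResolution}");
    }
}
```

Note that Stopwatch gives precise durations but no absolute timestamps; combine it with DateTime.UtcNow only when you need both.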
Source: https://stackoverflow.com/questions/53688977/datetime-precision-in-net-core