Question
I'm speaking from the .NET point of view but this could extend to other languages or frameworks that use similar logic.
Is it correct to assume that when sorting objects by a DateTime property, the DateTime values are compared by their Ticks (i.e., long integers)? And as a result, is sorting by DateTime not much, if at all, slower than sorting by integers?
Answer 1:
Yes, it compares ticks. Here is the actual implementation:
public int CompareTo(DateTime value) {
    // InternalTicks is the tick count stored in the DateTime's internal long field.
    long valueTicks = value.InternalTicks;
    long ticks = InternalTicks;
    if (ticks > valueTicks) return 1;
    if (ticks < valueTicks) return -1;
    return 0;
}
Answer 2:
You can use the TimeSpan.TicksPer* constants:
Module Module1
    Sub Main()
        ' Display these constants.
        Console.WriteLine(TimeSpan.TicksPerDay)
        Console.WriteLine(TimeSpan.TicksPerHour)
        Console.WriteLine(TimeSpan.TicksPerMinute)
        Console.WriteLine(TimeSpan.TicksPerSecond)
        Console.WriteLine(TimeSpan.TicksPerMillisecond)
    End Sub
End Module
Output
864000000000
36000000000
600000000
10000000
10000
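These constants make it easy to do duration arithmetic directly on raw tick values. A small sketch (hypothetical example, in C# rather than the answer's VB.NET):

```csharp
using System;

class TicksMathDemo
{
    static void Main()
    {
        DateTime start = new DateTime(2023, 1, 1, 0, 0, 0);
        DateTime end = start.AddMinutes(90);

        // The tick difference divided by TicksPerMinute gives whole minutes.
        long minutes = (end.Ticks - start.Ticks) / TimeSpan.TicksPerMinute;
        Console.WriteLine(minutes); // 90
    }
}
```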
Source: https://stackoverflow.com/questions/16370279/sort-by-integer-vs-by-datetime