What is the equivalent of Java's System.currentTimeMillis() in C#?
I know the question asks for an equivalent, but since I use these two for the same tasks, I'll throw in GetTickCount(). I might be nostalgic, but System.currentTimeMillis() and GetTickCount() are the only ones I use for getting ticks.
// requires: using System.Runtime.InteropServices;
[DllImport("kernel32.dll")]
static extern uint GetTickCount();

// call
uint ticks = GetTickCount();
Here is a simple way to approximate the Unix timestamp. Using UTC matches the Unix epoch definition, and you need to convert from double to long:
TimeSpan ts = (DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc));
long millis = (long)ts.TotalMilliseconds;
Console.WriteLine("millis={0}", millis);
prints:
millis=1226674125796
A common idiom in Java is to use currentTimeMillis() for timing or scheduling purposes, where you're not interested in the actual milliseconds since 1970, but instead compute some relative value and compare later invocations of currentTimeMillis() against it.
If that's what you're looking for, the C# equivalent is Environment.TickCount.
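A minimal sketch of that relative-timing idiom (the Thread.Sleep call stands in for real work and is only illustrative):

using System;
using System.Threading;

class TickCountDemo
{
    static void Main()
    {
        // Environment.TickCount is milliseconds since system start, so only
        // the difference between two readings is meaningful.
        int start = Environment.TickCount;

        Thread.Sleep(50); // stand-in for the work being timed

        int elapsed = Environment.TickCount - start;
        Console.WriteLine("elapsed ~{0} ms", elapsed);
    }
}

Note that Environment.TickCount is a signed 32-bit value and wraps after roughly 24.9 days of uptime; the subtraction above still yields a correct delta across a single wrap because of two's-complement arithmetic.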
The most straightforward way to get a raw millisecond count is:
DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond
Note that Ticks counts 100-nanosecond intervals since 1/1/0001 (in local time here), not since 1/1/1970, so the absolute value differs from System.currentTimeMillis(); it is only useful for relative measurements.
System.currentTimeMillis() in Java returns the current time in milliseconds since 1/1/1970 UTC.
In C# that would be:
public static double GetCurrentMilli()
{
DateTime Jan1970 = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
TimeSpan javaSpan = DateTime.UtcNow - Jan1970;
return javaSpan.TotalMilliseconds;
}
edit: made it utc as suggested :)
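On .NET Framework 4.6 / .NET Core and later, this computation is built into the framework, so the epoch math above can be replaced by a one-liner; a minimal sketch:

using System;

class UnixMillisDemo
{
    static void Main()
    {
        // DateTimeOffset.ToUnixTimeMilliseconds() returns milliseconds since
        // 1970-01-01T00:00:00Z, matching Java's System.currentTimeMillis().
        long millis = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
        Console.WriteLine(millis);
    }
}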