Question
Let me illustrate my question with this test program:
#include <iostream>
#include <chrono>

using std::chrono::nanoseconds;
using std::chrono::duration_cast;

int main(int argc, char* argv[])
{
    std::cout << "resolution (nano) = "
              << (double) std::chrono::high_resolution_clock::period::num
                 / std::chrono::high_resolution_clock::period::den * 1000 * 1000 * 1000
              << std::endl;

    auto t1 = std::chrono::high_resolution_clock::now();
    std::cout << "how much nanoseconds std::cout takes?" << std::endl;
    auto t2 = std::chrono::high_resolution_clock::now();

    auto diff = t2 - t1;
    nanoseconds ns = duration_cast<nanoseconds>(diff);
    std::cout << "std::cout takes " << ns.count() << " nanoseconds" << std::endl;
    return 0;
}
Output on my machine:
resolution (nano) = 100
how much nanoseconds std::cout takes?
std::cout takes 1000200 nanoseconds
I receive either 1000200, 1000300, 1000400, 1000500, 1000600, or 2000600 as a result (i.e. 1 or 2 milliseconds). Obviously either the resolution of std::chrono is not 100 nanoseconds, or the way I measure the running time of std::cout is wrong. (And why do I never receive something between 1 and 2 milliseconds, for example 1500000?)
I need a high-resolution timer in C++. The OS itself clearly provides one, because I'm able to measure things with microsecond precision using the C# Stopwatch class on the same machine. So I just need to use the OS's high-resolution timer correctly!
How do I fix my program to produce the expected results?
Answer 1:
I'm going to guess you are using VS2012; if not, disregard this answer. VS2012 typedefs high_resolution_clock to system_clock. Sadly, this means it has crappy precision (around 1 ms). I wrote a better high-resolution clock, which uses QueryPerformanceCounter, for use in VS2012...
HighResClock.h:

#include <chrono>

struct HighResClock
{
    typedef long long                              rep;
    typedef std::nano                              period;
    typedef std::chrono::duration<rep, period>     duration;
    typedef std::chrono::time_point<HighResClock>  time_point;
    static const bool is_steady = true;

    static time_point now();
};
HighResClock.cpp:

#include <windows.h>
#include "HighResClock.h"

namespace
{
    // Counter ticks per second, queried once at startup.
    const long long g_Frequency = []() -> long long
    {
        LARGE_INTEGER frequency;
        QueryPerformanceFrequency(&frequency);
        return frequency.QuadPart;
    }();
}

HighResClock::time_point HighResClock::now()
{
    LARGE_INTEGER count;
    QueryPerformanceCounter(&count);
    // Scale raw counter ticks to nanoseconds (period::den == 1,000,000,000).
    return time_point(duration(count.QuadPart * static_cast<rep>(period::den) / g_Frequency));
}
(I left out an assert and the #ifs that check whether the code is being compiled under VS2012.)
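For illustration, the omitted guard might look something like this; the exact check is my reconstruction, not the author's original code (VS2012 reports _MSC_VER 1700):

// Hypothetical guard (my reconstruction, not the author's exact code):
// only compile this replacement clock under VS2012, where
// high_resolution_clock is just a typedef for system_clock.
#if !defined(_MSC_VER) || _MSC_VER != 1700
#error "HighResClock is only intended as a VS2012 workaround"
#endif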
You can use this clock anywhere and in the same way as standard clocks.
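For example, the measurement from the question could be rewritten against it as follows (a sketch, assuming the two files above are saved as HighResClock.h and HighResClock.cpp and linked in):

#include <chrono>
#include <iostream>
#include "HighResClock.h"

int main()
{
    auto t1 = HighResClock::now();
    std::cout << "how much nanoseconds std::cout takes?" << std::endl;
    auto t2 = HighResClock::now();

    // HighResClock::duration already counts nanoseconds, so the cast is a
    // no-op here, but duration_cast works exactly as with the standard clocks.
    auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(t2 - t1);
    std::cout << "std::cout takes " << ns.count() << " nanoseconds" << std::endl;
}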
Answer 2:
The resolution of a clock is not necessarily the same as the smallest duration that can be represented by the data type the clock uses. In this case your implementation uses a data type which can represent a duration as small as 100 nanoseconds, but the underlying clock doesn't actually have such a resolution.
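One way to see the difference is to measure the clock's effective tick empirically: call now() in a loop until the reported time changes, and compare that step with the period advertised by the type. A minimal sketch:

#include <chrono>
#include <iostream>

int main()
{
    using hrc = std::chrono::high_resolution_clock;

    // The smallest duration the *type* can represent, in nanoseconds.
    std::cout << "type resolution: "
              << 1e9 * hrc::period::num / hrc::period::den << " ns\n";

    // The smallest step the *clock* actually reports: spin until now() changes.
    auto start = hrc::now();
    auto next  = start;
    while (next == start)
        next = hrc::now();

    std::cout << "observed tick:   "
              << std::chrono::duration_cast<std::chrono::nanoseconds>(next - start).count()
              << " ns\n";
}

On the asker's VS2012 setup this would report a 100 ns type resolution but an observed tick of roughly a millisecond.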
The low resolution of Visual Studio's high_resolution_clock has been an issue for several years. Microsoft's C++ standard library maintainer, Stephan T. Lavavej, has indicated that this was fixed in VS2015 through the use of QueryPerformanceCounter().
Answer 3:
Maybe your standard library implementation doesn't provide a higher-resolution timer?
It seems you are using Windows (you mention C#), so if you need a timer and are indeed on Windows, you can use QueryPerformanceFrequency and QueryPerformanceCounter.
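A minimal sketch of using those two calls directly (Windows-only):

#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER frequency, begin, end;
    QueryPerformanceFrequency(&frequency);   // counter ticks per second

    QueryPerformanceCounter(&begin);
    std::cout << "how much nanoseconds std::cout takes?" << std::endl;
    QueryPerformanceCounter(&end);

    // Convert elapsed ticks to nanoseconds, multiplying before dividing
    // so the integer division loses as little precision as possible.
    long long ns = (end.QuadPart - begin.QuadPart) * 1000000000LL / frequency.QuadPart;
    std::cout << "std::cout takes " << ns << " nanoseconds" << std::endl;
}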
Source: https://stackoverflow.com/questions/16299029/resolution-of-stdchronohigh-resolution-clock-doesnt-correspond-to-measureme