How can one measure time for a large block of code except when stopped at a breakpoint?


Question


I am working on a C++ game / simulation / graphics application on Windows.
If it matters, I am using Visual Studio 2013.

Setup: I am measuring the time from one frame to the next using QueryPerformanceCounter().

Steps to Reproduce: Stop at a breakpoint for some debugging.

Current Undesired Result: When the application resumes execution, the next call to QueryPerformanceCounter() returns a time value that includes all of the time spent paused at the debugger. This results in an abnormally large frame time.

Desired Result: QueryPerformanceCounter() will return a time value that does not include however much time I spent paused at a breakpoint for debugging.

How can I achieve the desired result?
1. Is there a different function that I should use instead of QueryPerformanceCounter?
2. Is there some setting in an IDE (i.e. Visual Studio) that can be changed?
3. Is there some special way to use QueryPerformanceCounter or other time function?
4. etc.

Reference Notes: There is a somewhat similar question listed here, but it was not helpful for me to identify a solution. Tracking Time Spent in Debugger


Answer 1:


I am not sure that I understand what you're trying to do.

  • Measuring durations in a Debug build is largely meaningless. The differences between Debug and Release are dramatic, and the ratio between the execution times of different functions is not necessarily the same in Debug as in Release.
  • If you need to find bottlenecks, use a profiler (in VC, go to Debug | Profiler).

If you still want to measure the time in Debug mode, here's an idea:

#include <iostream>
#include <chrono>
#include <intrin.h>   // __debugbreak()
#include <Windows.h>  // Sleep()

// Measures elapsed wall-clock time from construction.
struct duration
{
  std::chrono::high_resolution_clock::time_point _s;

  duration() : _s( std::chrono::high_resolution_clock::now() )
  {
    // nop
  }

  // double rather than auto: VS2013 lacks C++14 return-type deduction.
  double ms()
  {
    return std::chrono::duration<double, std::milli>( std::chrono::high_resolution_clock::now() - _s ).count();
  }
};

// Breaks into the debugger in its constructor; on destruction, shifts the
// duration's start point forward by however long the break lasted, so the
// time spent stopped in the debugger is excluded from the measurement.
struct breakpoint
{
  std::chrono::high_resolution_clock::time_point _s;
  duration& _d;

  breakpoint( duration& a_d ) : _s( std::chrono::high_resolution_clock::now() ), _d( a_d )
  {
    __debugbreak(); // MSVC intrinsic; __asm { int 3 } only compiles for x86
  }

  ~breakpoint()
  {
    _d._s += std::chrono::high_resolution_clock::now() - _s;
  }
};

int main()
{
  duration d;

  Sleep( 1000 );
  std::cout << d.ms() << std::endl; // ~1000 ms

  // Temporary object: breaks into the debugger, then compensates for the
  // pause as soon as the statement ends and the destructor runs.
  breakpoint { d };

  Sleep( 1000 );
  std::cout << d.ms() << std::endl; // ~2000 ms, however long you sat at the break

  return 0;
}



Answer 2:


Use a commercial profiling tool. Measuring how long it took to execute a function or method is exactly what they are made for.

Also, why would you care how long something took while you were debugging? Are you planning on releasing to your customers in debug mode, and crawling through line by line remotely while they play? If you want to examine frame time, examine it in release with optimizations turned on.



Source: https://stackoverflow.com/questions/36370915/how-can-one-measure-time-for-a-large-block-of-code-except-when-stopped-at-a-brea
