#include <vector>
#include <iostream>
using namespace std;
int main()
{
    vector< vector<int> > dp(50000, vector<int>(4, -1));
}
Running a program with the debugger attached is always slower than without.
This must be caused by VS hooking into the new/delete calls and doing more checking when attached, or by the runtime library using the IsDebuggerPresent API and doing things differently in that case.
You can easily try this from inside Visual Studio: start the program with Debug->Start Debugging or Debug->Start Without Debugging. Starting without debugging is just like running from the command line, with exactly the same build configuration and executable.
Yeah, WTF indeed.
You know your compiler will optimize away a lot of those function calls by inlining them, and then further optimize the remaining code to eliminate anything that isn't actually doing any work, which in the case of vectors of int means: pretty much not a lot.
In debug mode, inlining is not turned on because that would make debugging awful.
This is a nice example of how fast C++ code can really be.
It's definitely HeapFree that's slowing this down; you can get the same effect with the program below.
Passing parameters like HEAP_NO_SERIALIZE to HeapFree doesn't help either.
#include "stdafx.h"
#include <iostream>
#include <windows.h>

using namespace std;

int _tmain(int argc, _TCHAR* argv[])
{
    HANDLE heap = HeapCreate(0, 0, 0);
    void** pointers = new void*[50000];
    int i = 0;
    for (i = 0; i < 50000; ++i)
    {
        pointers[i] = HeapAlloc(heap, 0, 4 * sizeof(int));
    }
    cout << i;
    for (i = 49999; i >= 0; --i)
    {
        HeapFree(heap, 0, pointers[i]);
    }
    cout << "!";
    delete [] pointers;
    HeapDestroy(heap);
}
Running in the debugger changes the memory allocation library used to one that does a lot more checking. A program that does nothing but memory allocation and de-allocation is going to suffer much more than a "normal" program.
Edit: Having just tried running your program under VS, I get a call stack that looks like
ntdll.dll!_RtlpValidateHeapEntry@12() + 0x117 bytes
ntdll.dll!_RtlDebugFreeHeap@12() + 0x97 bytes
ntdll.dll!_RtlFreeHeapSlowly@12() + 0x228bf bytes
ntdll.dll!_RtlFreeHeap@12() + 0x17646 bytes
msvcr90d.dll!_free_base(void * pBlock=0x0061f6e8) Line 109 + 0x13 bytes
msvcr90d.dll!_free_dbg_nolock(void * pUserData=0x0061f708, int nBlockUse=1)
msvcr90d.dll!_free_dbg(void * pUserData=0x0061f708, int nBlockUse=1)
msvcr90d.dll!operator delete(void * pUserData=0x0061f708)
desc.exe!std::allocator<int>::deallocate(int * _Ptr=0x0061f708, unsigned int __formal=4)
desc.exe!std::vector<int,std::allocator<int> >::_Tidy() Line 1134 C++
which shows the debug functions in ntdll.dll and the debug C runtime being used.
8 seconds?? I tried the same in Debug mode. Not more than half a second I guess. Are you sure it's the destructors?
FYI. Visual Studio 2008 SP1, Core 2 Duo 6700 CPU with 2GB of RAM.
Makes no sense to me: attaching a debugger to a random binary in a normal configuration should mostly just trap breakpoint interrupts (asm int 3, etc.).