If a malloc allocation fails, should we try it again?
In something like this:
    char* mystrdup(const char *s)
    {
        char *ab = NULL;

        while (ab == NULL)
            ab = (char*)malloc(strlen(s) + 1);

        strcpy(ab, s);
        return ab;
    }
Without arguing about why or when this would be useful: retrying the allocation in a loop can work, at least on Windows with 64-bit code and default pagefile settings. Moreover, it can buy a surprising amount of additional virtual memory. Do not retry in an infinite loop, though; use a finite number of retries instead. As a demonstration, try the following code, which leaks memory in 1 MB chunks. Run it in a Release build, preferably not under a debugger.
    #include <cstdlib>
    #include <iostream>

    int main()
    {
        for (int i = 0; i < 10; i++)
        {
            size_t allocated = 0;
            while (1)
            {
                // Allocate 1 MB blocks until malloc fails; nothing is freed.
                void* p = malloc(1024 * 1024);
                if (!p)
                    break;
                allocated += 1;
            }
            // This prints only after malloc has failed.
            std::cout << "Allocated: " << allocated << " MB\n";
            //Sleep(1000);
        }
        return 0;
    }
On my machine with 8 GB of RAM and a system-managed pagefile, I get the following output (built with VS2013 for the x64 target, tested on Windows 7 Pro):
    Allocated: 14075 MB
    Allocated: 16 MB
    Allocated: 2392 MB
    Allocated: 3 MB
    Allocated: 2791 MB
    Allocated: 16 MB
    Allocated: 3172 MB
    Allocated: 16 MB
    Allocated: 3651 MB
    Allocated: 15 MB
I don't know the exact reason for this behavior, but it seems allocations start failing once pagefile resizing cannot keep up with the allocation rate. On my machine the pagefile grew from 8 GB to 20 GB after this loop (and dropped back to 8 GB after the program exited).