Reading a large (~1GB) data file with C++ sometimes throws bad_alloc, even though I have more than 10GB of RAM available
I'm trying to read the data contained in a .dat file of size ~1.1GB. Because I'm doing this on a 16GB RAM machine, I thought it would not be a problem to read the whole file into memory at once and only process it afterwards. To do this, I employed the slurp function found in this SO answer.

The problem is that the code sometimes, but not always, throws a bad_alloc exception. Looking at the task manager, I see that there are always at least 10GB of free memory available, so I don't see how memory would be an issue.

Here is the code that reproduces this error:

```cpp
#include <iostream>
#include
```