bad-alloc

Error when using Decision Trees in OpenCV 3.0.0-rc1

扶醉桌前 submitted on 2021-02-19 06:01:15
Question: I am doing some machine learning in OpenCV and I'm using Decision Trees. I am currently using OpenCV 3.0.0-rc1. Whenever I attempt to train Decision Trees with my training data and labels, I get either terminate called after throwing an instance of 'std::bad_alloc' what(): std::bad_alloc or Segmentation fault, depending on what I put into setMaxDepth(): if the number is larger than 22 it's bad_alloc, otherwise it's a segfault. Here's my source code: //import data Mat trainData=imread("/home
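A minimal sketch of how OpenCV 3's ml::DTrees is typically trained with a modest tree depth and the allocation failure caught instead of terminating; the sample and label matrices here are hypothetical placeholders, not the asker's data:

```cpp
#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>
#include <iostream>

int main() {
    // Hypothetical training data: 100 samples, 5 float features, integer class labels.
    cv::Mat samples(100, 5, CV_32F);
    cv::Mat labels(100, 1, CV_32S);
    cv::randu(samples, 0.f, 1.f);
    cv::randu(labels, 0, 2);

    cv::Ptr<cv::ml::DTrees> tree = cv::ml::DTrees::create();
    tree->setMaxDepth(10);       // keep the depth modest; very deep trees are memory-hungry
    tree->setCVFolds(0);         // skip built-in cross-validation pruning
    tree->setMinSampleCount(2);

    try {
        tree->train(cv::ml::TrainData::create(samples, cv::ml::ROW_SAMPLE, labels));
    } catch (const std::bad_alloc&) {
        std::cerr << "training ran out of memory\n";
        return 1;
    }
    return 0;
}
```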

Trying to code a Graph in C++, getting bad_alloc some of the time

不问归期 submitted on 2021-02-08 12:01:20
Question: I'm new to C++ after learning basic object-oriented programming in Java, so I'm having a difficult time grasping memory deallocation. The assignment was to create a Weighted Directed Graph... I'm getting the error "terminate called after throwing an instance of 'std::bad_alloc' what(): std::bad_alloc" when I run certain inputs through my code, and I'm having a difficult time figuring out what is causing it. I googled the error and found that it was a memory problem, so I attempted to go
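Coming from Java, the usual way to sidestep manual new/delete in a graph is to let standard containers own the memory. A minimal sketch of a weighted directed graph built on std::vector; the names Graph, Edge, and addEdge are illustrative, not the assignment's actual API:

```cpp
#include <vector>
#include <cstddef>
#include <iostream>

// Adjacency-list representation: each vertex owns a vector of (target, weight) edges.
struct Edge { std::size_t to; double weight; };

class Graph {
public:
    explicit Graph(std::size_t vertexCount) : adj(vertexCount) {}

    void addEdge(std::size_t from, std::size_t to, double weight) {
        adj.at(from).push_back({to, weight});   // at() throws on a bad index instead of corrupting memory
    }

    const std::vector<Edge>& edgesFrom(std::size_t v) const { return adj.at(v); }

private:
    std::vector<std::vector<Edge>> adj;   // freed automatically, no delete needed
};

int main() {
    Graph g(3);
    g.addEdge(0, 1, 2.5);
    g.addEdge(1, 2, 1.0);
    std::cout << g.edgesFrom(0).size() << " edge(s) from vertex 0\n";
}
```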

What is the most common reason that “bad_alloc” is thrown?

邮差的信 submitted on 2020-04-10 18:05:26
Question: Is there a particular memory-related error that most frequently throws bad_alloc? I understand that it means memory allocation has failed, but what is the most common mistake in code that leads to this? Answer 1: EDIT: The other commenters have pointed out a few interesting scenarios. I'm adding them to my response for the sake of completeness. Case 1: Running out of memory. My understanding is that bad_alloc is thrown whenever the operators new and new[] fail to allocate memory to an object or
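A minimal sketch of the "running out of memory" case: a single oversized new[] request fails and operator new throws std::bad_alloc, which can be caught; the exact request size that fails naturally depends on the platform:

```cpp
#include <iostream>
#include <new>

int main() {
    try {
        // Deliberately request far more memory than the system can provide.
        std::size_t huge = static_cast<std::size_t>(-1) / 2;
        char* block = new char[huge];
        delete[] block;   // not reached on typical systems
    } catch (const std::bad_alloc& e) {
        std::cerr << "allocation failed: " << e.what() << '\n';
    }
}
```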

Out of memory (?) problem on Win32 (vs. Linux)

此生再无相见时 submitted on 2020-01-06 08:23:25
Question: I have the following problem: a program run on a Windows machine (32-bit, 3.1 GB memory, both VC++ 2008 and MinGW compiled code) fails with a bad_alloc exception (after allocating around 1.2 GB; the exception is thrown when trying to allocate a vector of 9 million doubles, i.e. around 75 MB) with plenty of RAM still available (at least according to Task Manager). The same program run on Linux machines (32-bit, 4 GB memory; 32-bit, 2 GB memory) runs fine with peak memory usage of around 1.6 GB.
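A hedged diagnostic sketch: in a 32-bit process, a bad_alloc with RAM still free often means there is no contiguous stretch of address space left for the request, so probing the largest single block that can still be allocated at the point of failure is informative. The helper below is illustrative, not part of the asker's program:

```cpp
#include <cstddef>
#include <iostream>
#include <new>

// Binary-search the largest single block that operator new can still provide.
std::size_t largestAllocatableBlock() {
    std::size_t lo = 0, hi = std::size_t(1) << 31;   // probe up to 2 GB
    while (lo + 1 < hi) {
        std::size_t mid = lo + (hi - lo) / 2;
        char* p = new (std::nothrow) char[mid];
        if (p) { delete[] p; lo = mid; } else { hi = mid; }
    }
    return lo;
}

int main() {
    std::cout << "largest contiguous block: "
              << largestAllocatableBlock() / (1024 * 1024) << " MB\n";
}
```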

Reading large (~1GB) data file with C++ sometimes throws bad_alloc, even if I have more than 10GB of RAM available

无人久伴 submitted on 2020-01-02 22:01:01
Question: I'm trying to read the data contained in a .dat file of size ~1.1 GB. Because I'm doing this on a machine with 16 GB of RAM, I thought it would not be a problem to read the whole file into memory at once and only process it afterwards. To do this, I employed the slurp function found in this SO answer. The problem is that the code sometimes, but not always, throws a bad_alloc exception. Looking at the task manager I see that there are always at least 10 GB of free memory available, so I don't see how
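One common alternative, independent of the slurp code referenced in the question (which is not shown here), is to avoid demanding a single contiguous ~1 GB buffer and instead stream the file in fixed-size chunks; a minimal sketch with a hypothetical file name:

```cpp
#include <fstream>
#include <vector>
#include <iostream>

int main() {
    std::ifstream in("data.dat", std::ios::binary);   // hypothetical file name
    if (!in) { std::cerr << "cannot open file\n"; return 1; }

    std::vector<char> buffer(64 * 1024 * 1024);   // 64 MB chunk, reused for every read
    std::size_t total = 0;
    while (in) {
        in.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
        std::streamsize got = in.gcount();
        if (got <= 0) break;
        total += static_cast<std::size_t>(got);
        // ... process buffer[0 .. got) here ...
    }
    std::cout << "read " << total << " bytes\n";
}
```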

Allocating large blocks of memory with new

…衆ロ難τιáo~ submitted on 2019-12-21 12:29:03
Question: I need to allocate large blocks of memory with new. I am stuck with using new because I am writing a mock for the producer side of a two-part application. The actual producer code allocates these large blocks, and my code has responsibility for deleting them (after processing them). Is there a way I can ensure my application is capable of allocating such a large amount of memory from the heap? Can I set the heap to a larger size? My case is 64 blocks of 288,000 bytes. Sometimes I am
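For scale, 64 blocks of 288,000 bytes is roughly 18 MB in total, which an ordinary heap grows to accommodate on demand. A minimal sketch that preallocates the blocks with new, reports a failure instead of terminating, and deletes everything afterwards; the block count and size are taken from the question, everything else is illustrative:

```cpp
#include <iostream>
#include <new>
#include <vector>

int main() {
    const std::size_t blockCount = 64;
    const std::size_t blockSize  = 288000;   // bytes, as in the question

    std::vector<char*> blocks;
    blocks.reserve(blockCount);
    try {
        for (std::size_t i = 0; i < blockCount; ++i)
            blocks.push_back(new char[blockSize]);
        std::cout << "allocated " << blockCount * blockSize << " bytes in "
                  << blockCount << " blocks\n";
    } catch (const std::bad_alloc&) {
        std::cerr << "allocation failed after " << blocks.size() << " blocks\n";
    }
    for (char* p : blocks)
        delete[] p;   // consumer side frees each block after processing
}
```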

what(): std::bad_alloc - am I out of memory?

北战南征 submitted on 2019-12-20 04:37:10
Question: My dataset: 500,000 points in 960 dimensions. The size of the file is 1.9 GB (1,922,000,000 bytes). The code works for smaller data sets, but for this one it crashes at the same point every time. Here is a minimal example: #include <iostream> #include <vector> template<typename T> class Division_Euclidean_space { public: /** * The data type. */ typedef T FT; /** * Constructor, which * sets 'N' and 'D' to zero. */ Division_Euclidean_space() : N(0), D(0) { } /** * @param n - size of data */ void
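A quick back-of-the-envelope check, assuming the points are stored as 4-byte floats (the excerpt does not say): 500,000 × 960 × 4 bytes ≈ 1.92 GB, which matches the file size; as doubles it would be ~3.84 GB. Holding that in a single std::vector needs one contiguous allocation of the full size, which a 32-bit build or an already loaded machine may not be able to provide even when total RAM looks sufficient. A minimal sketch that reserves the space up front and reports the failure instead of crashing:

```cpp
#include <iostream>
#include <new>
#include <vector>

int main() {
    const std::size_t points = 500000;
    const std::size_t dims   = 960;

    std::vector<float> data;
    try {
        data.reserve(points * dims);   // one contiguous ~1.92 GB allocation
    } catch (const std::bad_alloc&) {
        std::cerr << "cannot reserve " << points * dims * sizeof(float)
                  << " bytes in one block\n";
        return 1;
    }
    std::cout << "reserved capacity for " << data.capacity() << " floats\n";
}
```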