Question
I have a class that requires a large amount of memory.
class BigClass {
public:
    BigClass() {
        bf1[96000000-1] = 1;
    }
    double bf1[96000000];
};
I can only instantiate the class by `new`-ing an object on the heap.
BigClass *c = new BigClass();
assert( c->bf1[96000000-1] == 1 );
delete c;
If I instantiate it without `new`, I get a segmentation fault at runtime.
BigClass c; // SIGSEGV!
How can I determine the memory limit? Or should I always use `new` instead?
Answer 1:
The stack has a fixed size that depends on compiler and linker options. See your compiler documentation to change the stack size for your executable.
Anyway, for big objects, prefer using new, or better: smart pointers like shared_ptr (from Boost, from std::tr1, or from std:: if you have a very recent compiler).
Answer 2:
First of all, since you've titled this C++ and not C, why are you using raw arrays? Instead, may I suggest vector<double>, or, if contiguous memory is causing problems, deque<double>, which relaxes the constraint on contiguous memory without giving up nearly constant-time lookup.
Using vector or deque may also alleviate other segfault issues that could plague your project at a later date, for instance, overrunning the bounds of your array. If you convert to using vector or deque, you can use the .at(x) member function to retrieve and set values in your collection. Should you attempt to access out of bounds, that function will throw an exception.
Answer 3:
You shouldn't play that game ever. Your code could be called from another function or on a thread with a lower stack size limit and then your code will break nastily. See this closely related question.
If you're in doubt, use heap allocation (new) - either directly with smart pointers (like auto_ptr) or indirectly using std::vector.
Answer 4:
There is no platform-independent way of determining the memory limit. For "large" amounts of memory, you're far safer allocating on the heap (i.e. using new); you can check for success by comparing the result of new(std::nothrow) against NULL, or by catching the std::bad_alloc exception that plain new throws on failure.
Answer 5:
The way your class is designed is, as you discovered, quite fragile. Instead of requiring clients to always allocate your objects on the heap, your class itself should allocate the huge memory block on the heap, preferably with std::vector, or possibly with a shared_ptr if vector doesn't work for some reason. Then you don't have to worry about how your clients use the object; it's safe to put on either the stack or the heap.
Answer 6:
On Linux, in the Bash shell, you can check the stack size with ulimit -s. Variables with automatic storage duration will have their space allocated on the stack. As others have said, there are better ways of approaching this:
- Use a std::vector to hold your data inside your BigClass.
- Allocate the memory for bf1 inside BigClass's constructor and then free it in the destructor.
- If you must have a large double[] member, allocate each instance of BigClass with some kind of smart pointer; if you don't need shared access, something as simple as std::auto_ptr will let you safely construct/destroy your object:
std::auto_ptr<BigClass> myBigClass(new BigClass);
myBigClass->bf1; // your array
Source: https://stackoverflow.com/questions/4106655/how-large-is-the-attributes-can-a-class-object-hold-how-to-determine-the-stack