Question
I read in other answers that there is no limit imposed by the C++ compiler on the maximum size of std::vector. I am trying to use a vector for one purpose, and I need it to hold 10^19 items.
typedef struct {
    unsigned long price, weight;
} product;

// inside main
unsigned long long n = 930033404565174954;
vector<product> psorted(n);
The program crashes on the last statement. If I try resize(n) instead of constructing with n, the program also crashes, with the message:

vector<T> too long
std::length_error at memory location

I need to sort the data according to price after putting it in the vector. What should I do?
Answer 1:
std::vector does have limits on how much stuff it can carry. You can query this with std::vector::max_size, which returns the maximum size you can use.
10^19 items.
Do you have 10^19 * sizeof(product) bytes of memory? I'm guessing that you don't have ~138 exabytes of RAM. Plus, you'd have to be compiling in 64-bit mode to even consider allocating that much. The compiler isn't breaking; your program is failing at run time because it is trying to allocate far more memory than it can get.
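For reference, here is a minimal sketch of querying max_size and then sorting by price with std::sort. The element count (1,000,000) is a placeholder chosen so that the allocation is actually feasible; it is not the 10^19 from the question, and it assumes the data fits in ordinary RAM.

#include <algorithm>
#include <iostream>
#include <vector>

struct product {
    unsigned long price, weight;
};

int main() {
    std::vector<product> psorted;
    // Upper bound the implementation reports; a real allocation can still fail
    // earlier with std::bad_alloc if the machine doesn't have the memory.
    std::cout << "max_size: " << psorted.max_size() << '\n';

    unsigned long long n = 1000000;  // illustrative, feasible size
    if (n <= psorted.max_size()) {
        psorted.resize(n);
        // ... fill psorted ...
        std::sort(psorted.begin(), psorted.end(),
                  [](const product& a, const product& b) { return a.price < b.price; });
    }
}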
Answer 2:
Others have already told you what the problem is. One possible solution is to use the STXXL library, which is an implementation of STL that's designed for huge, out-of-memory datasets.
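If the data really does have to live on disk, a rough sketch of what an STXXL-based version might look like is below. It follows STXXL's documented interfaces as I recall them (stxxl::VECTOR_GENERATOR, stxxl::sort, and a comparator that also supplies min_value()/max_value() sentinels); exact names and requirements may differ between STXXL versions, so treat this as an outline rather than tested code.

#include <limits>
#include <stxxl/sort>
#include <stxxl/vector>

struct product {
    unsigned long price, weight;
};

// stxxl::sort expects a comparator that also provides smallest/largest sentinel values.
struct cmp_price {
    bool operator()(const product& a, const product& b) const { return a.price < b.price; }
    product min_value() const { return product{std::numeric_limits<unsigned long>::min(), 0}; }
    product max_value() const { return product{std::numeric_limits<unsigned long>::max(), 0}; }
};

int main() {
    // An external-memory vector backed by disk instead of RAM.
    typedef stxxl::VECTOR_GENERATOR<product>::result product_vector;
    product_vector psorted;
    psorted.resize(100000000ULL);  // still has to fit on your disks

    // ... fill psorted ...

    // Sort using at most 512 MiB of internal memory.
    stxxl::sort(psorted.begin(), psorted.end(), cmp_price(), 512 * 1024 * 1024);
}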
However, 10^19 8-byte items is 80 million TB. I'm not sure anyone has a disk that large...
Also, assuming a generous disk bandwidth of 300MB/s, this would take 8000 years to write!
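(For the arithmetic: 10^19 items * 8 bytes = 8 * 10^19 bytes = 80,000,000 TB, and 8 * 10^19 bytes / (300 MB/s) is about 2.7 * 10^11 seconds, on the order of 8,000 years.)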
Source: https://stackoverflow.com/questions/8955400/stl-vectort-too-long