In light of this article, I am wondering what people's experiences are with storing massive datasets (say, >10,000,000 objects) in-memory, using arrays to store data fields instead of objects.
I've done such a thing for the rapidSTORM project, where several million sparsely populated objects need to be cached (localization microscopy). While I can't really give you good code snippets (too many dependencies), I found the implementation very quick and straightforward with Boost Fusion: adapt ("fusionize") the structure, build a vector for each element type, and write a small accessor over those vectors that reconstructs each element on demand. A stripped-down sketch of the idea follows.
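This is not the rapidSTORM code: the struct, the field names, and the store class are made up for illustration, and the per-member vectors are written out by hand here rather than generated generically from the Fusion sequence.

```cpp
#include <boost/fusion/include/adapt_struct.hpp>
#include <boost/fusion/include/at_c.hpp>
#include <cstddef>
#include <vector>

// Hypothetical localization record -- the real rapidSTORM fields differ.
struct Localization {
    float x, y;
    float intensity;
};

// Make the struct a Fusion sequence so its members can be accessed generically.
BOOST_FUSION_ADAPT_STRUCT(
    Localization,
    (float, x)
    (float, y)
    (float, intensity)
)

// Structure-of-arrays store: one std::vector per member instead of a
// single std::vector<Localization>.
class LocalizationStore {
    std::vector<float> xs_, ys_, intensities_;
public:
    void push_back(Localization const& l) {
        xs_.push_back(boost::fusion::at_c<0>(l));
        ys_.push_back(boost::fusion::at_c<1>(l));
        intensities_.push_back(boost::fusion::at_c<2>(l));
    }

    // Accessor that reconstructs a full element from the per-field vectors.
    Localization operator[](std::size_t i) const {
        return Localization{ xs_[i], ys_[i], intensities_[i] };
    }

    std::size_t size() const { return xs_.size(); }
};
```

In a full implementation the per-member vectors can be derived from the fusionized struct itself (so the store stays in sync with the struct automatically), which is what made the approach so compact.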
(D'oh, I just noticed that you tagged the question, but maybe my C++ answer helps as well)