This is my little big question about containers, in particular, arrays.
I am writing a physics code that mainly manipulates a big (> 1 000 000) set of "particles" (each one holding six doubles).
The first rule when choosing a container is to use std::vector. Then, only after your code is complete and you can actually measure performance, you can try other containers. But stick to vector first (and call reserve() from the start).
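For instance, a minimal sketch of that starting point (the element count of one million is just an assumption based on your description):

    #include <vector>

    int main() {
        const std::size_t n = 1000000;   // assumed particle count (> 1 000 000 in your case)

        std::vector<double> values;
        values.reserve(n);               // allocate once up front, so push_back never reallocates

        for (std::size_t i = 0; i < n; ++i)
            values.push_back(0.0);       // placeholder data
    }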
Then, for the particle itself, you shouldn't use an std::vector. You know the size of your data: it is six doubles. There is no need for it to be dynamic; it is constant and fixed. You can define a struct to hold your particle members (the six doubles), or you can simply typedef it: typedef double particle[6]. Then, use a vector of particles: std::vector<particle>.
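As a sketch, here is the struct option (the member names are only an assumption about your data layout):

    #include <vector>

    // Each particle is a fixed block of six doubles -- no dynamic allocation per particle.
    struct Particle {
        double x, y, z;      // position (illustrative member names)
        double vx, vy, vz;   // velocity (illustrative member names)
    };

    int main() {
        std::vector<Particle> particles;
        particles.reserve(1000000);   // reserve() from the start, as above

        particles.push_back({0.0, 0.0, 0.0, 1.0, 0.0, 0.0});
    }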
Furthermore, because your program accesses the particle data in the vector sequentially, you will take full advantage of the modern CPU's cache read-ahead (prefetching).
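A sequential pass over such a vector might look like this sketch (the time step dt and the update rule are purely illustrative):

    // Illustrative only: advance positions by one (hypothetical) time step dt.
    void advance(std::vector<Particle>& particles, double dt) {
        for (Particle& p : particles) {   // contiguous, sequential access -> prefetch-friendly
            p.x += p.vx * dt;
            p.y += p.vy * dt;
            p.z += p.vz * dt;
        }
    }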