When you map an iterable to a multiprocessing.Pool, are the iterations divided into a queue for each process in the pool at the start, or is there a common queue from which a process takes the next task when it becomes free?
From http://docs.python.org/2/library/multiprocessing.html#multiprocessing.pool.multiprocessing.Pool.map:

    map(func, iterable[, chunksize])
        This method chops the iterable into a number of chunks which it submits to the process pool as separate tasks. The (approximate) size of these chunks can be specified by setting chunksize to a positive integer.
I presume a process picks up the next chunk from a queue when it is done with its previous chunk.
The default chunksize depends on the length of the iterable and is chosen so that the number of chunks is approximately four times the number of processes. (source)
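For reference, CPython's Pool.map computes that default roughly like this (a sketch of the internal logic, not a documented API):

```python
def default_chunksize(n_items, n_workers):
    # Aim for roughly 4 chunks per worker, rounding the chunk size up
    # so no items are left over; mirrors the divmod-based logic in
    # CPython's multiprocessing.pool module.
    chunksize, extra = divmod(n_items, n_workers * 4)
    if extra:
        chunksize += 1
    return chunksize

# 100 items on a 4-process pool: ceil(100 / 16) = 7 items per chunk,
# giving 15 chunks -- close to the target of 16.
print(default_chunksize(100, 4))
```

So longer iterables get larger chunks, which amortizes the per-task IPC overhead while still leaving enough tasks for workers to balance load dynamically.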