I encountered a weird problem while using the Python multiprocessing library.
My code is sketched below: I spawn a process for each "symbol, date" tuple. I combine th
You should probably call close() followed by join() on your Pool object.

http://docs.python.org/library/multiprocessing.html#module-multiprocessing.pool

join()
    Wait for the worker processes to exit. One must call close() or terminate() before using join().
Did you try closing the pool with pool.close() and then waiting for the workers to finish with pool.join()? If the parent process keeps running and never waits for its child processes, they become zombies.
Try setting the maxtasksperchild argument on the pool. If you don't, each worker process is reused over and over again by the pool, so its memory is never released. When it is set, a worker is allowed to die after completing that many tasks, and a new one is created in its place. That effectively cleans up the memory.

I believe it's new in 2.7: http://docs.python.org/2/library/multiprocessing.html#module-multiprocessing.pool
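A quick sketch of the recycling behavior, using a trivial worker that just reports its PID (the task itself is hypothetical; the point is the maxtasksperchild parameter):

```python
import os
from multiprocessing import Pool

def worker(_):
    # report which worker process handled this task
    return os.getpid()

if __name__ == "__main__":
    # maxtasksperchild=1: each worker exits after one task and is
    # replaced by a fresh process, releasing whatever memory it held
    with Pool(processes=2, maxtasksperchild=1) as pool:
        pids = pool.map(worker, range(8), chunksize=1)
    # more distinct PIDs than pool processes => workers were recycled
    print(len(set(pids)))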