Python multiprocessing - How to release memory when a process is done?

醉酒成梦 2020-12-14 15:27

I encountered a weird problem while using python multiprocessing library.

My code is sketched below: I spawn a process for each "symbol, date" tuple. I combine th…

3 Answers
  • 2020-12-14 16:08

    You should probably call close() followed by join() on your Pool object.

    http://docs.python.org/library/multiprocessing.html#module-multiprocessing.pool

    join() Wait for the worker processes to exit. One must call close() or terminate() before using join().
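    A minimal sketch of that pattern, with a placeholder fetch() function and task list standing in for the asker's per-(symbol, date) work:

        from multiprocessing import Pool

        def fetch(task):
            symbol, date = task
            # ... per-(symbol, date) work goes here (placeholder) ...
            return symbol, date

        if __name__ == "__main__":
            tasks = [("AAPL", "2020-12-01"), ("MSFT", "2020-12-02")]
            pool = Pool(processes=4)
            results = pool.map(fetch, tasks)
            pool.close()   # stop accepting new tasks
            pool.join()    # wait for the worker processes to exit and be reaped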

  • 2020-12-14 16:13

    Did you try closing the pool with pool.close() and then waiting for the workers to finish with pool.join()? If the parent process keeps running and never waits for its child processes, they will become zombies.
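    If each (symbol, date) tuple gets its own Process object instead of going through a Pool, the same rule applies; a sketch, with work() and the tuples as placeholders:

        from multiprocessing import Process

        def work(symbol, date):
            # ... per-(symbol, date) work goes here (placeholder) ...
            pass

        if __name__ == "__main__":
            tuples = [("AAPL", "2020-12-01"), ("MSFT", "2020-12-02")]
            procs = [Process(target=work, args=t) for t in tuples]
            for p in procs:
                p.start()
            for p in procs:
                p.join()   # reap each child; finished but unjoined children linger as zombies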

  • 2020-12-14 16:15

    Try setting the maxtasksperchild argument on the pool. If you don't, the pool reuses the same worker process over and over, so its memory is never released. With maxtasksperchild set, the worker process is allowed to exit and a new one is created in its place, which effectively frees the memory.

    I guess it's new in 2.7: http://docs.python.org/2/library/multiprocessing.html#module-multiprocessing.pool
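    A sketch of that approach, again with fetch() and the task list as placeholders; maxtasksperchild=1 makes each worker handle a single task and then exit, so its memory goes back to the OS:

        from multiprocessing import Pool

        def fetch(task):
            symbol, date = task
            # ... memory-heavy per-(symbol, date) work goes here (placeholder) ...
            return symbol, date

        if __name__ == "__main__":
            tasks = [("AAPL", "2020-12-01"), ("MSFT", "2020-12-02")]
            # Each worker process exits after completing one task and is replaced
            # by a fresh process, releasing whatever memory it had accumulated.
            pool = Pool(processes=4, maxtasksperchild=1)
            results = pool.map(fetch, tasks)
            pool.close()
            pool.join()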
