Memory usage keeps growing with Python's multiprocessing.pool
Here's the program:

```
#!/usr/bin/python
import multiprocessing

def dummy_func(r):
    pass

def worker():
    pass

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=16)
    for index in range(0, 100000):
        pool.apply_async(worker, callback=dummy_func)
    # clean up
    pool.close()
    pool.join()
```

I found that memory usage (both VIRT and RES) kept growing until close()/join(). Is there any solution to get rid of this? I tried maxtasksperchild with Python 2.7, but it didn't help either.

I have a more complicated program that calls apply_async() ~6M times, and at around the 1.5M mark I already have 6G+ RES; to avoid all other factors, I simplified the program to the version above.
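For reference, here is a rough sketch of the kind of batching workaround I'd rather not have to use: submit tasks in fixed-size chunks and wait for each chunk to finish before continuing, so pending AsyncResult objects don't pile up. The CHUNK_SIZE value below is arbitrary, and I'm hoping there is a cleaner way.

```
#!/usr/bin/python
# Sketch of a batching workaround (not what I want to do in the real program):
# keep at most CHUNK_SIZE outstanding tasks, then wait for them before
# submitting more, so the pool's internal bookkeeping stays bounded.
import multiprocessing

def dummy_func(r):
    pass

def worker():
    pass

if __name__ == '__main__':
    CHUNK_SIZE = 10000  # arbitrary batch size, just for illustration
    pool = multiprocessing.Pool(processes=16)
    pending = []
    for index in range(0, 100000):
        pending.append(pool.apply_async(worker, callback=dummy_func))
        if len(pending) >= CHUNK_SIZE:
            for r in pending:
                r.wait()      # block until this chunk is done
            pending = []      # drop references so they can be collected
    for r in pending:
        r.wait()
    pool.close()
    pool.join()
```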