shared-memory

Shared memory in multiprocessing

荒凉一梦 submitted on 2019-11-25 21:25:36
I have three large lists. The first contains bitarrays (module bitarray 0.8.0) and the other two contain arrays of integers.

    l1 = [bitarray 1, bitarray 2, ..., bitarray n]
    l2 = [array 1, array 2, ..., array n]
    l3 = [array 1, array 2, ..., array n]

These data structures take quite a bit of RAM (~16GB total). If I start 12 sub-processes using:

    multiprocessing.Process(target=someFunction, args=(l1, l2, l3))

does this mean that l1, l2 and l3 will be copied for each sub-process, or will the sub-processes share these lists? Or, to be more direct, will I use 16GB or 192GB of RAM? someFunction will read some …
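
A minimal sketch of one common workaround, assuming the data can be held in ctypes-backed buffers (the name some_function and the sizes below are illustrative, not from the question): multiprocessing.Array allocates the values in shared memory once, so every Process started afterwards reads the same buffer instead of receiving its own copy.

    from multiprocessing import Array, Process

    def some_function(shared_ints):
        # Workers only read from the shared buffer; no per-process copy is made.
        print(sum(shared_ints[:10]))

    if __name__ == "__main__":
        # One shared allocation of integers; lock=False is acceptable because
        # the sub-processes never write to it.
        shared_ints = Array('i', range(1000), lock=False)

        procs = [Process(target=some_function, args=(shared_ints,))
                 for _ in range(12)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()

On POSIX systems that fork, plain Python lists are also shared copy-on-write at first, but reference-count updates gradually touch the pages, so explicit shared memory is the more predictable option.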

Shared-memory objects in multiprocessing

你说的曾经没有我的故事 submitted on 2019-11-25 18:40:44
Suppose I have a large in-memory numpy array, and a function func that takes this giant array as input (together with some other parameters). func with different parameters can be run in parallel. For example:

    def func(arr, param):
        # do stuff to arr, param

    # build array arr
    pool = Pool(processes=6)
    results = [pool.apply_async(func, [arr, param]) for param in all_params]
    output = [res.get() for res in results]

If I use the multiprocessing library, that giant array will be copied multiple times into the different processes. Is there a way to let the different processes share the same array?
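
A minimal sketch of one way to avoid those copies, assuming the array is read-only inside the workers: back the numpy array with a multiprocessing.RawArray and hand that buffer to each Pool worker through an initializer, so that only the small param values are pickled per task. The shape, dtype and the body of func below are assumptions for illustration.

    import numpy as np
    from multiprocessing import Pool, RawArray

    def init_worker(raw, shape):
        # Re-wrap the shared buffer as a numpy array in each worker (no copy).
        global shared_arr
        shared_arr = np.frombuffer(raw, dtype=np.float64).reshape(shape)

    def func(param):
        # Read-only work against the shared array; only param travels per task.
        return float(shared_arr.sum()) * param

    if __name__ == "__main__":
        shape = (1000, 1000)
        raw = RawArray('d', shape[0] * shape[1])      # single shared allocation
        arr = np.frombuffer(raw, dtype=np.float64).reshape(shape)
        arr[:] = np.random.rand(*shape)               # build array arr once, in the parent

        all_params = [0.5, 1.0, 2.0]
        with Pool(processes=6, initializer=init_worker,
                  initargs=(raw, shape)) as pool:
            results = [pool.apply_async(func, [p]) for p in all_params]
            output = [res.get() for res in results]
        print(output)

On Python 3.8+ the multiprocessing.shared_memory module offers a similar pattern using named shared-memory blocks.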