Python's multiprocessing and memory

Unresolved · 2 answers · 888 views

梦如初夏 · 2021-02-07 03:48

I am using multiprocessing.imap_unordered to perform a computation on a list of values:

def process_parallel(fnc, some_list):
    pool = multiprocessing.Pool()
    for result in pool.imap_unordered(fnc, some_list):
        for x in result:
            yield x
    pool.close()
    pool.join()
2 Answers
  •  轮回少年
     2021-02-07 04:20

    As you can see in the corresponding source file (python2.7/multiprocessing/pool.py), the IMapUnorderedIterator stores results in a collections.deque instance: each new item is appended as it arrives from a worker, and popped off again as you iterate.
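For illustration, the pattern the result iterator uses internally can be sketched with a `collections.deque` guarded by a `threading.Condition` (the names below are illustrative, not the pool's actual attributes):

```python
import collections
import threading

# Illustrative sketch of the iterator's internals: a deque of finished
# results, guarded by a condition variable.  The receiver thread must
# acquire the lock to append; the consumer pops items off the left.
items = collections.deque()
cond = threading.Condition()

def receiver():
    # Stands in for the pool's result-receiver thread.
    for value in ("a", "b", "c"):
        with cond:                 # blocks while the consumer holds the lock
            items.append(value)
            cond.notify()

def consume_all(expected):
    results = []
    with cond:
        while len(results) < expected:
            while not items:
                cond.wait()        # release the lock, wait for the receiver
            results.append(items.popleft())
    return results

t = threading.Thread(target=receiver)
t.start()
result = consume_all(3)
t.join()
print(result)  # ['a', 'b', 'c']
```

Holding `cond` while consuming is exactly the lever the answer below pulls on: the receiver cannot append while the consumer owns the lock.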

    As you suspected, if more huge objects arrive while the main thread is still processing one, they are all buffered in that deque and held in memory.

    What you might try is something like this:

    it = pool.imap_unordered(fnc, some_list)
    for result in it:
        # Hold the iterator's internal condition lock so the
        # result-receiver thread cannot append further items
        # while we are still processing this one.
        it._cond.acquire()
        try:
            for x in result:
                yield x
        finally:
            it._cond.release()
    

    This should block the task-result-receiver thread while you process an item, if it tries to put the next object into the deque. There should then never be more than two of the huge objects in memory at once. Whether that works for your case, I don't know ;)
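    An alternative that avoids touching the private `_cond` attribute is to throttle the input side instead: wrap the input in a generator that blocks on a semaphore until the consumer has released a slot, so only a bounded number of results can ever be buffered. This is a sketch under assumptions, not the answerer's method; the helper name `bounded_imap_unordered` is made up, and the thread-based `multiprocessing.dummy.Pool` is used here only so the demo runs anywhere with the same API.

    ```python
    import threading
    from multiprocessing.dummy import Pool  # thread-based, same API as Pool

    def bounded_imap_unordered(pool, fnc, iterable, max_in_flight=2):
        """Yield results while at most max_in_flight tasks are buffered.

        A slot is taken before each task is submitted and released only
        after the caller has consumed the corresponding result.
        """
        sem = threading.Semaphore(max_in_flight)

        def throttled():
            for item in iterable:
                sem.acquire()          # wait until a result slot is free
                yield item

        for result in pool.imap_unordered(fnc, throttled()):
            yield result
            sem.release()              # consumer is done with this result

    pool = Pool(4)
    out = sorted(bounded_imap_unordered(pool, lambda x: x * x, range(6)))
    pool.close()
    pool.join()
    print(out)  # [0, 1, 4, 9, 16, 25]
    ```

    The same wrapper works with a real `multiprocessing.Pool`, provided `fnc` is picklable (a module-level function rather than a lambda).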
