python multiprocessing - process hangs on join for large queue

攒了一身酷 2020-12-14 07:10

I'm running Python 2.7.3 and I noticed the following strange behavior. Consider this minimal example:

from multiprocessing import Process, Queue

def foo(qin, qout):
    while True:
        bar = qin.get()
        if bar is None:
            break
        qout.put({'bar': bar})

4 Answers
  •  青春惊慌失措
    2020-12-14 07:54

    There must be a limit on the size of queues. Consider the following modification:

    from multiprocessing import Process, Queue
    
    def foo(qin,qout):
        while True:
            bar = qin.get()
            if bar is None:
                break
            #qout.put({'bar':bar})
    
    if __name__=='__main__':
        import sys
    
        qin=Queue()
        qout=Queue()   ## POSITION 1
        for i in range(100):
            #qout=Queue()   ## POSITION 2
            worker=Process(target=foo,args=(qin,qout))  # foo needs both queues
            worker.start()
            for j in range(1000):
                x=i*100+j
                print x
                sys.stdout.flush()
                qin.put(x**2)
    
            qin.put(None)
            worker.join()
    
        print 'Done!'
    

    This works as-is (with the qout.put line commented out). If you try to save all 100000 results, then qout becomes too large: if I uncomment qout.put({'bar':bar}) in foo and leave the definition of qout at POSITION 1, the code hangs. If, however, I move the definition of qout to POSITION 2, so that each worker gets its own queue holding at most 1000 results, the script finishes.

    So in short, you have to be careful that neither qin nor qout becomes too large. (See also: Multiprocessing Queue maxsize limit is 32767)
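
    The underlying reason (not stated in the answer above, but documented for multiprocessing.Queue) is that a queue hands buffered items to its pipe from a background "feeder" thread, and a process that has put items on a queue will not terminate until that buffer has been flushed; if nobody reads the queue, the pipe fills up and join() blocks forever. Below is a minimal sketch of the usual fix for the single-queue version, assuming the same foo as in the question: drain qout in the parent before calling join().

    from multiprocessing import Process, Queue

    def foo(qin, qout):
        while True:
            bar = qin.get()
            if bar is None:
                break
            qout.put({'bar': bar})

    if __name__ == '__main__':
        qin = Queue()
        qout = Queue()
        worker = Process(target=foo, args=(qin, qout))
        worker.start()

        for i in range(100000):
            qin.put(i**2)
        qin.put(None)

        # Read every result *before* join(): this keeps emptying the pipe,
        # so the worker's feeder thread can flush and the process can exit.
        results = [qout.get() for _ in range(100000)]

        worker.join()
        print 'Done! Collected %d results' % len(results)

    Reading the results before join() scales to any number of items, whereas POSITION 2 only avoids the hang because 1000 results presumably still fit in the pipe's buffer.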
