Multiprocessing Queue in Python

抹茶落季 2020-12-07 23:48

I'm trying to use a queue with the multiprocessing library in Python. After executing the code below, the print statements work, but the processes do not quit after I call

5 Answers
  •  醉酒成梦
    2020-12-08 00:28

    Here is a sentinel-free method for the relatively simple case where you put a number of tasks on a JoinableQueue, then launch worker processes that consume the tasks and exit once they have read the queue "dry". The trick is to use JoinableQueue.get_nowait() instead of get(). get_nowait(), as the name implies, tries to get a value from the queue without blocking; if there is nothing to be had, a queue.Empty exception is raised. The worker handles this exception by exiting.

    Rudimentary code to illustrate the principle:

    import multiprocessing as mp
    from queue import Empty  # multiprocessing queues raise the standard queue.Empty
    
    def worker(q):
      while True:
        try:
          work = q.get_nowait()
          # ... do something with `work`
          q.task_done()
        except Empty:
          break  # queue drained -- completely done
    
    # main -- guarded so workers can be spawned safely on all platforms
    if __name__ == '__main__':
      worknum = 4
      jq = mp.JoinableQueue()
    
      # fill up the task queue *before* starting the workers,
      # so no worker can see an empty queue prematurely;
      # let's assume `tasks` contains some sort of data
      # that your workers know how to process
      for task in tasks:
        jq.put(task)
    
      procs = [mp.Process(target=worker, args=(jq,)) for _ in range(worknum)]
      for p in procs:
        p.start()
    
      for p in procs:
        p.join()
    

    The advantage is that you do not need to put the "poison pills" on the queue so the code is a bit shorter.

    IMPORTANT: in more complex situations, where producers and consumers use the same queue in an "interleaved" manner and the workers may have to wait for new tasks to come along, the "poison pill" approach should be used. My suggestion above is for simple cases where the workers "know" that once the task queue is empty, there is no point hanging around any more.
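    For comparison, here is a minimal sketch of that poison-pill approach. The names (`worker`, `worknum`) mirror the snippet above, and `task * 2` is just a stand-in for real processing; a plain `None` is used as the sentinel, one per worker:

    ```python
    import multiprocessing as mp

    def worker(q, out):
        """Consume tasks until a None sentinel arrives, then exit."""
        while True:
            task = q.get()       # blocking get: safe even if producers are slow
            if task is None:     # the "poison pill": no more work will come
                break
            out.put(task * 2)    # stand-in for real processing

    if __name__ == '__main__':
        worknum = 4
        q, out = mp.Queue(), mp.Queue()

        procs = [mp.Process(target=worker, args=(q, out)) for _ in range(worknum)]
        for p in procs:
            p.start()

        for task in range(10):       # example payload
            q.put(task)
        for _ in range(worknum):     # one pill per worker, so each one stops
            q.put(None)

        for p in procs:
            p.join()

        results = sorted(out.get() for _ in range(10))
        print(results)               # the doubled inputs, in sorted order
    ```

    Because get() blocks, the workers happily wait through lulls in production, and they only exit when explicitly told to, which is exactly what the interleaved case needs.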
