Python multiple subprocess with a pool/queue recover output as soon as one finishes and launch next job in queue

Submitted by 99封情书 on 2019-12-05 01:25:50

ThreadPool could be a good fit for your problem: you set the number of worker threads and add jobs, and the threads will work their way through all the tasks.

from multiprocessing.pool import ThreadPool
import subprocess


def work(sample):
    my_tool_subprocess = subprocess.Popen(
        'mytool {}'.format(sample), shell=True, stdout=subprocess.PIPE)
    while True:
        myline = my_tool_subprocess.stdout.readline()
        if not myline:  # readline() returns b'' at EOF, so break here
            break
        # here I parse stdout..
    my_tool_subprocess.wait()  # reap the child process


num = None  # set to the number of workers you want (it defaults to the cpu count of your machine)
tp = ThreadPool(num)
for sample in all_samples:
    tp.apply_async(work, (sample,))

tp.close()
tp.join()
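If you also want to consume each result the moment its job finishes (and have the pool launch the next queued job automatically), `imap_unordered` does exactly that. Below is a minimal, runnable sketch of the same pattern; `echo` stands in for your real tool, and the sample names are made up for illustration:

```python
from multiprocessing.pool import ThreadPool
import subprocess


def work(sample):
    # 'echo' is a stand-in for the real tool so the sketch is runnable
    proc = subprocess.Popen(['echo', sample], stdout=subprocess.PIPE, text=True)
    out, _ = proc.communicate()
    return sample, out.strip()


samples = ['s1', 's2', 's3', 's4']
tp = ThreadPool(2)  # at most two subprocesses in flight at a time
results = {}
# imap_unordered yields each result as soon as its worker finishes;
# the pool immediately hands the freed thread the next queued sample
for sample, output in tp.imap_unordered(work, samples):
    results[sample] = output
tp.close()
tp.join()
```

With `apply_async` you would instead collect `AsyncResult` objects and poll them; `imap_unordered` is the more direct fit when results should be processed in completion order.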
hayder alshawk

Well, as I understood your question, your problem is that the result of the first process, once it has finished, should be supplied to the second process, then to the third, and so on. To achieve this you should import the threading module and use the Thread class:

proc = threading.Thread(target=func, args=(func_arguments,))  # Thread class
proc.start()  # starting the thread
proc.join()   # this ensures that the next thread does not start
              # until the previous one has finished
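The pattern above can be sketched end to end. This is a minimal, self-contained example (the `tag` argument and the `order` list are illustrative stand-ins for real work); starting and joining each thread before creating the next forces strictly sequential execution:

```python
import threading

order = []


def func(tag):
    # stand-in for real work; records the order of execution
    order.append(tag)


for tag in ['first', 'second', 'third']:
    proc = threading.Thread(target=func, args=(tag,))
    proc.start()
    proc.join()  # wait here, so the next thread starts only after this one ends
```

Note that with this pattern there is never more than one worker thread alive at a time, which is the point when each job depends on the previous one's result.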

hayder alshawk

Well, if this is the case, you should write the same code as above but without proc.join(). The main thread will then start the other four threads without waiting for each to finish. This is multithreading within a single process (in other words, no benefit from a multicore processor). To benefit from a multicore processor you should use the multiprocessing module like this:

proc = multiprocessing.Process(target=func, args=(func_arguments,))
proc.start()

This way each would be a separate process, and separate processes can run completely independently of one another.
