Python multiple subprocesses with a pool/queue: recover output as soon as one finishes and launch the next job in the queue
Question

I'm currently launching a subprocess and parsing its stdout on the fly, without waiting for the process to finish before parsing.

```python
import subprocess

for sample in all_samples:
    my_tool_subprocess = subprocess.Popen(
        'mytool {}'.format(sample), shell=True, stdout=subprocess.PIPE)
    line = my_tool_subprocess.stdout.readline()
    while line:
        # here I parse stdout ...
        line = my_tool_subprocess.stdout.readline()
```

My script performs this action multiple times, once per input sample. The main problem here is that every subprocess is a program/tool
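To make the goal concrete, here is a minimal sketch of the behaviour I'm after, not my current code: it assumes a fixed pool of 4 workers and a hypothetical `parse_line()` helper standing in for the stdout parsing above. Each worker runs one `mytool` subprocess and parses its output as it streams, and the pool starts the next queued sample as soon as a worker frees up.

```python
# Sketch only: bounded concurrency for 'mytool' subprocesses.
# parse_line() is a hypothetical stand-in for the stdout parsing
# shown in the snippet above.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_one(sample):
    proc = subprocess.Popen(
        'mytool {}'.format(sample), shell=True, stdout=subprocess.PIPE)
    for raw_line in proc.stdout:      # stream stdout line by line as it arrives
        parse_line(raw_line)          # hypothetical parsing helper
    return proc.wait()                # exit code for this sample's run

# At most 4 samples run at once; the next one starts as soon as any finishes.
with ThreadPoolExecutor(max_workers=4) as pool:
    exit_codes = list(pool.map(run_one, all_samples))
```

Threads (rather than extra processes) are enough here in this sketch, since each worker spends its time blocked on subprocess I/O rather than doing CPU-bound work in Python.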