> I'd like to create a program that runs multiple light threads, but limits itself to a constant, predefined number of concurrent running tasks, like this (but with no risk of …)
It would be much easier to implement this as a thread pool or executor, using either `multiprocessing.dummy.Pool` or `concurrent.futures.ThreadPoolExecutor` (or, if using Python 2.x, the backport `futures`). For example:
```python
import concurrent.futures

def f(arg):
    print("Started a task. arg=%s" % arg)
    for i in range(100000):    # simulate some work
        pass
    print("Done")

with concurrent.futures.ThreadPoolExecutor(8) as executor:
    while True:
        arg = get_task()       # your existing pull-model task source
        executor.submit(f, arg)
```
Of course if you can change the pull-model `get_task` to a push-model `get_tasks` that, e.g., yields tasks one at a time, this is even simpler:
```python
with concurrent.futures.ThreadPoolExecutor(8) as executor:
    for arg in get_tasks():
        executor.submit(f, arg)
```
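The same push-model shape also fits the `multiprocessing.dummy.Pool` option mentioned above. This is just a rough sketch, assuming the same `f` and `get_tasks` as in the examples here:

```python
from multiprocessing.dummy import Pool  # thread-based Pool with the multiprocessing.Pool API

pool = Pool(8)               # at most 8 worker threads
pool.map(f, get_tasks())     # blocks until every task has been processed
pool.close()                 # no more work will be submitted
pool.join()                  # wait for the workers to exit
```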
When you run out of tasks (e.g., `get_task` raises an exception, or `get_tasks` runs dry), this will automatically tell the executor to stop after it drains the queue, wait for it to stop, and clean up everything.