> I am trying to properly understand and implement two concurrently running Task objects using Python 3's relatively new asyncio module. In a nutshell, asyncio seems […]
Yes, any coroutine that's running inside your event loop will block other coroutines and tasks from running, unless it uses `yield from` or `await` (if using Python 3.5+). This is because asyncio is single-threaded; the only way for the event loop to run is for no other coroutine to be actively executing. Using `yield from`/`await` suspends the coroutine temporarily, giving the event loop a chance to work.
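To make that concrete, here's a minimal sketch (the `busy` and `polite` coroutine names are just illustrative, not from your code) showing the difference between a coroutine that never yields and one that does:

```python
import asyncio

async def busy(name):
    # No await inside the loop: this coroutine hogs the event loop
    # until it returns, so the other task cannot run in between.
    for i in range(3):
        print(name, i)

async def polite(name):
    # await asyncio.sleep(0) suspends the coroutine, giving the event
    # loop a chance to run the other task between iterations.
    for i in range(3):
        print(name, i)
        await asyncio.sleep(0)

async def main():
    print('-- no awaits: output does not interleave --')
    await asyncio.gather(busy('boo'), busy('baa'))

    print('-- awaiting: output interleaves --')
    await asyncio.gather(polite('boo'), polite('baa'))

asyncio.run(main())
```

The `busy` tasks run to completion one after the other, while the `polite` tasks interleave because each `await` hands control back to the event loop.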
Your example code is fine, but in many cases, you probably wouldn't want long-running code that isn't doing asynchronous I/O running inside the event loop to begin with. In those cases, it often makes more sense to use `asyncio.loop.run_in_executor` to run the code in a background thread or process. `ProcessPoolExecutor` would be the better choice if your task is CPU-bound; `ThreadPoolExecutor` would be used if you need to do some I/O that isn't `asyncio`-friendly.
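For the `ThreadPoolExecutor` case, a minimal sketch might look like the following; `blocking_read` is a made-up stand-in for whatever blocking call you actually need (here it just sleeps):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_read():
    # Stand-in for blocking I/O with no asyncio-friendly API
    # (a legacy database driver, for example); time.sleep simulates the wait.
    time.sleep(1)
    return 'some data'

async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=2) as executor:
        # Awaiting the executor future keeps the event loop free while
        # the blocking call runs in a worker thread.
        result = await loop.run_in_executor(executor, blocking_read)
        print(result)

asyncio.run(main())
```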
Your two loops, for example, are completely CPU-bound and don't share any state, so the best performance would come from using `ProcessPoolExecutor` to run each loop in parallel across CPUs:
```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

print('running async test')

def say_boo():
    i = 0
    while True:
        print('...boo {0}'.format(i))
        i += 1

def say_baa():
    i = 0
    while True:
        print('...baa {0}'.format(i))
        i += 1

if __name__ == "__main__":
    executor = ProcessPoolExecutor(2)
    # Create a fresh event loop; asyncio.get_event_loop() is deprecated
    # when there is no running loop.
    loop = asyncio.new_event_loop()
    # run_in_executor schedules each function on the process pool and
    # returns an awaitable Future; no asyncio.create_task wrapper is needed
    # (create_task only accepts coroutines and requires a running loop).
    boo = loop.run_in_executor(executor, say_boo)
    baa = loop.run_in_executor(executor, say_baa)

    loop.run_forever()
```
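Note that the executor futures don't need to be wrapped in `asyncio.create_task`: `run_in_executor` already schedules the work and returns an awaitable `Future`. If your worker functions eventually return (rather than looping forever), you could also await those futures and let the program exit on its own; a rough sketch, where `count` is a hypothetical finite stand-in for `say_boo`/`say_baa`:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def count(label, n):
    # Hypothetical finite version of say_boo/say_baa so the program can exit.
    for i in range(n):
        print('...{0} {1}'.format(label, i))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(2) as executor:
        # Each call runs in its own process; gather waits until both finish.
        await asyncio.gather(
            loop.run_in_executor(executor, count, 'boo', 10),
            loop.run_in_executor(executor, count, 'baa', 10),
        )

if __name__ == "__main__":
    asyncio.run(main())
```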