python-asyncio

Read in parallel and write sequentially?

限于喜欢 submitted on 2021-02-07 10:30:33
Question: I have the following code, which reads and writes for each id sequentially:

    async def main():
        while id < 1000:
            data = await read_async(id)
            await data.write_async(f'{id}.csv')
            id += 1

read_async() takes several minutes and write_async() takes less than one minute to run. Now I want to run read_async(id) in parallel. However, at most 3 calls can run in parallel because of a memory limitation, and write_async has to run sequentially, i.e., write_async(n+1) cannot run before write_async(n). Answer 1:
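One way to satisfy both constraints is to throttle the reads with an asyncio.Semaphore while consuming the results in id order. This is a minimal sketch, not the question's real code: read_async and write_async below are fast stand-ins for the slow I/O functions described above.

```python
import asyncio

# Hypothetical stand-ins for the question's slow I/O: a "read" just
# returns its id after a short sleep, and a "write" appends to a list.
async def read_async(i):
    await asyncio.sleep(0.01)
    return i

results = []

async def write_async(data):
    results.append(data)  # appends record the sequential write order

async def main(n=10, max_reads=3):
    sem = asyncio.Semaphore(max_reads)  # at most 3 reads in flight

    async def limited_read(i):
        async with sem:
            return await read_async(i)

    # Start all reads up front; the semaphore throttles them to 3.
    tasks = [asyncio.create_task(limited_read(i)) for i in range(n)]
    # Consume the tasks in id order, so write_async(n+1) can never
    # run before write_async(n).
    for t in tasks:
        await write_async(await t)

asyncio.run(main())
```

Note that a task whose read has finished may hold its result in memory until its turn to be written; if that buffering matters for the memory limit, a bounded queue is the stricter alternative.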

python asyncio gets deadlock if multiple stdin input is needed

不打扰是莪最后的温柔 submitted on 2021-02-07 05:27:19
Question: I wrote a command-line tool to execute git pull for multiple git repos using python asyncio. It works fine if all repos have password-less ssh login set up. It also works fine if only one repo needs password input. When multiple repos require password input, it seems to deadlock. My implementation is very simple. The main logic is

    utils.exec_async_tasks(
        utils.run_async(path, cmds) for path in repos.values())

where run_async creates and awaits a subprocess call, and exec_async_tasks runs all
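The question's utils module is not shown, but the usual cause of this deadlock is several subprocesses competing for the same terminal stdin at once. A sketch of one fix (the function names mirror the question but the implementation is invented, and a trivial subprocess stands in for git pull): serialize the interactive part with an asyncio.Lock.

```python
import asyncio
import sys

async def run_async(path, cmds, stdin_lock):
    # Only one subprocess at a time owns the terminal, so two password
    # prompts can never interleave and starve each other of input.
    async with stdin_lock:
        proc = await asyncio.create_subprocess_exec(*cmds, cwd=path)
        return await proc.wait()

async def exec_async_tasks(paths, cmds):
    stdin_lock = asyncio.Lock()  # created inside the running loop
    return await asyncio.gather(
        *(run_async(p, cmds, stdin_lock) for p in paths))

# A trivial subprocess stands in for `git pull` here:
codes = asyncio.run(exec_async_tasks(
    [".", ".", "."], [sys.executable, "-c", "pass"]))
```

With the lock held around the whole subprocess, at most one prompt owns the terminal at a time; the trade-off is that repos needing input no longer pull in parallel.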

How to cancel all remaining tasks in gather if one fails?

不打扰是莪最后的温柔 submitted on 2021-02-07 05:10:23
Question: In case one task of gather raises an exception, the others are still allowed to continue. Well, that's not exactly what I need. I want to distinguish between errors that are fatal and need to cancel all remaining tasks, and errors that are not and should instead be logged while allowing other tasks to continue. Here is my failed attempt to implement this:

    from asyncio import gather, get_event_loop, sleep

    class ErrorThatShouldCancelOtherTasks(Exception):
        pass

    async def my_sleep(secs):
        await
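One pattern that matches the requirement (a sketch, not the question's attempt; the worker names and delays are invented): wrap the coroutines in tasks, let gather raise on the first fatal error, then cancel whatever is still pending.

```python
import asyncio

class FatalError(Exception):
    pass

log = []

async def worker(name, secs, exc=None):
    await asyncio.sleep(secs)
    if exc is not None:
        raise exc
    log.append(name)

async def main():
    tasks = [
        asyncio.create_task(worker("a", 0.01)),
        asyncio.create_task(worker("boom", 0.05, FatalError())),
        asyncio.create_task(worker("b", 10)),  # never finishes normally
    ]
    try:
        # Without return_exceptions, gather raises on the first error...
        await asyncio.gather(*tasks)
    except FatalError:
        # ...but sibling tasks keep running unless cancelled explicitly.
        for t in tasks:
            t.cancel()
        await asyncio.gather(*tasks, return_exceptions=True)

asyncio.run(main())
```

Non-fatal errors can be handled the opposite way: catch and log them inside the worker itself, so they never propagate to gather at all.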

Difference between multiprocessing, asyncio and concurrent.futures in Python

无人久伴 submitted on 2021-02-06 09:18:48
Question: Being new to concurrency, I am confused about when to use the different Python concurrency libraries. To my understanding, multiprocessing, multithreading and asynchronous programming are all forms of concurrency, while multiprocessing belongs to a subset of concurrency called parallelism. I searched the web for different ways to approach concurrency in Python, and I came across the multiprocessing library, concurrent.futures' ProcessPoolExecutor() and ThreadPoolExecutor(), and
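For the concurrent.futures part of the question, the executor API is deliberately uniform: the same map() call runs a function on a pool of threads or a pool of processes. A minimal sketch (square is a toy function, not from the question):

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

# ThreadPoolExecutor suits I/O-bound work (the GIL is released while
# waiting on I/O); swapping in ProcessPoolExecutor gives true parallelism
# for CPU-bound work, at the cost of pickling arguments between processes.
with ThreadPoolExecutor(max_workers=4) as pool:
    squares = list(pool.map(square, range(5)))
```

When switching to ProcessPoolExecutor, the worker function must be importable at module level and the pool should be created under an `if __name__ == "__main__":` guard on platforms that spawn rather than fork.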

asyncio and coroutines vs task queues

南笙酒味 submitted on 2021-02-06 07:56:51
Question: I've been reading about the asyncio module in Python 3, and more broadly about coroutines in Python, and I can't see what makes asyncio such a great tool. I have the feeling that everything you can do with coroutines, you can do better with task queues based on the multiprocessing module (Celery, for example). Are there use cases where coroutines are better than task queues? Answer 1: Not a proper answer, but a list of hints that could not fit into a comment: You are mentioning the multiprocessing
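One concrete case where coroutines beat a task queue: overlapping many small I/O waits in a single thread and process, with no broker or worker pool to deploy. A sketch, with asyncio.sleep standing in for a network call:

```python
import asyncio
import time

async def fake_io(i):
    await asyncio.sleep(0.1)  # stands in for one network request
    return i

async def main():
    # 100 "requests" run concurrently in one thread, one process.
    return await asyncio.gather(*(fake_io(i) for i in range(100)))

start = time.monotonic()
results = asyncio.run(main())
elapsed = time.monotonic() - start  # roughly 0.1 s, not 100 * 0.1 s
```

Coroutines are cooperative and cheap (no per-task process or serialization overhead), which is exactly the regime, many concurrent waits, where a multiprocessing-backed task queue is heaviest.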

Start an async function inside a new thread

試著忘記壹切 submitted on 2021-02-05 12:17:23
Question: I'm trying to create a Discord bot, and I need to run an async function in another, new thread, since the main thread is required to run another function (the Discord client). What I'm trying to accomplish:

    # This method needs to run in another thread
    async def discord_async_method():
        while True:
            sleep(10)
            print("Hello World")

    ...

    # Discord async logic
    # This needs to run in the main thread
    client.run(TOKEN)

    thread = ""
    try:
        # This does not work, throws error "printHelloWorld Needs to be awaited"
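One way to do this (a sketch: the coroutine below is a finite stand-in for the question's infinite loop, and the blocking client.run(TOKEN) call is only indicated in a comment) is to give the coroutine its own event loop on a separate thread via asyncio.run:

```python
import asyncio
import threading

messages = []

# Finite stand-in for the question's discord_async_method, so the
# sketch terminates. Note it awaits asyncio.sleep: the original's
# blocking time.sleep(10) would stall this loop.
async def discord_async_method():
    for _ in range(3):
        await asyncio.sleep(0.01)
        messages.append("Hello World")

# asyncio.run creates and drives a fresh event loop on the new thread,
# leaving the main thread free for the blocking Discord client.
thread = threading.Thread(target=asyncio.run,
                          args=(discord_async_method(),))
thread.start()
# ... main thread would call client.run(TOKEN) here ...
thread.join()
```

If instead you need to submit coroutines to an already-running loop from another thread, asyncio.run_coroutine_threadsafe is the tool for that direction.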