python-asyncio

How to parallelize the for loop inside an async function and track for loop execution status?

Submitted by 荒凉一梦 on 2021-01-20 20:13:30
Question: I recently asked a question about how to track the progress of a for loop inside a deployed API. Here's the link. The solution code that worked for me is:

    from fastapi import FastAPI, UploadFile
    from typing import List
    import asyncio
    import uuid

    context = {'jobs': {}}
    app = FastAPI()

    async def do_work(job_key, files=None):
        iter_over = files if files else range(100)
        for file, file_number in enumerate(iter_over):
            jobs = context['jobs']
            job_info = jobs[job_key]
            job_info['iteration'] =
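
Below is a minimal sketch (not the asker's original solution) of one way to both parallelize the loop and keep a progress counter: schedule one coroutine per item and run them with asyncio.gather(). process_one() and the 'done'/'total' keys are assumptions for illustration; the real per-item work would have to be awaitable for this to help.

    import asyncio

    context = {'jobs': {}}

    async def process_one(job_key, item):
        # Hypothetical stand-in for the real per-file work; it must await
        # something (I/O, an executor, ...) for gather() to buy concurrency.
        await asyncio.sleep(0.01)
        context['jobs'][job_key]['done'] += 1  # fine: one event loop, no races

    async def do_work(job_key, files=None):
        iter_over = list(files) if files else list(range(100))
        context['jobs'][job_key] = {'total': len(iter_over), 'done': 0}
        # Run all iterations concurrently instead of one after another.
        await asyncio.gather(*(process_one(job_key, item) for item in iter_over))

    async def main():
        task = asyncio.create_task(do_work('demo'))
        while not task.done():
            print(context['jobs'].get('demo'))  # poll progress, as a /status route would
            await asyncio.sleep(0.1)
        await task

    asyncio.run(main())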

Correct usage of asyncio.Condition's wait_for() method

Submitted by 空扰寡人 on 2021-01-04 02:48:15
Question: I'm writing a project using Python's asyncio module, and I'd like to synchronize my tasks using its synchronization primitives. However, it doesn't behave as I'd expect. From the documentation, Condition.wait_for() offers a way for a coroutine to wait until a user-defined predicate evaluates as true. But when I try the method, it behaves in ways I wouldn't expect: my condition is only checked once, and if it is found to
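
For reference, a minimal sketch of the usual pattern: wait_for() evaluates the predicate once up front, then only re-evaluates it after another task acquires the condition and calls notify()/notify_all(), which is the step most often missed.

    import asyncio

    async def consumer(cond, state):
        async with cond:
            # Checks the predicate now, then sleeps until notified and
            # re-checks it after every notify()/notify_all().
            await cond.wait_for(lambda: state['ready'])
            print('predicate became true, proceeding')

    async def producer(cond, state):
        await asyncio.sleep(0.5)
        async with cond:
            state['ready'] = True
            cond.notify_all()  # without this, wait_for() never re-evaluates

    async def main():
        cond = asyncio.Condition()
        state = {'ready': False}
        await asyncio.gather(consumer(cond, state), producer(cond, state))

    asyncio.run(main())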

asyncio/aiohttp - create_task() blocks event loop, gather results in “This event loop is already running”

Submitted by 落花浮王杯 on 2021-01-01 08:15:08
Question: I cannot get both my consumer and my producer running at the same time; it seems worker(), or the aiohttp server, is blocking, even when they are executed together with asyncio.gather(). If instead I do loop.create_task(worker), this blocks and the server never starts. I've tried every variation I can imagine, including the nest_asyncio module, and I can only ever get one of the two components running. What am I doing wrong?

    async def worker():
        batch_size = 30
        print("running worker")
        while
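
One pattern that sidesteps “This event loop is already running” is to start aiohttp programmatically with AppRunner inside a single coroutine, rather than letting web.run_app() spin up its own loop, and to make sure the worker actually awaits inside its while loop. A minimal sketch (the handler, host, and port are placeholders):

    import asyncio
    from aiohttp import web

    async def worker():
        print('running worker')
        while True:
            # ... consume one batch here ...
            await asyncio.sleep(1)  # without an await, this loop starves the
                                    # event loop and the server never responds

    async def handle(request):
        return web.Response(text='ok')

    async def main():
        app = web.Application()
        app.router.add_get('/', handle)
        runner = web.AppRunner(app)   # programmatic startup, unlike run_app(),
        await runner.setup()          # so we stay inside the one running loop
        site = web.TCPSite(runner, 'localhost', 8080)
        await site.start()
        await worker()                # server keeps serving concurrently

    asyncio.run(main())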

Executing run_coroutine_threadsafe in a separate thread

Submitted by 丶灬走出姿态 on 2021-01-01 04:20:09
Question: I have a script that runs forever (it watches for changes in files). I need to send Discord messages whenever a weird file is created. The problem is that the event-watching function (def run(self): below) comes from a subclass, so I can't change it to async def run(self):, and therefore I can't use await channel.send(). My solution was to use run_coroutine_threadsafe, as explained here: https://stackoverflow.com/a/53726266/9283107. That works well! But the problem is, the messages get
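
The basic shape of that approach, as a sketch (send_message is a hypothetical stand-in for the real channel.send()): the synchronous thread submits coroutines to the loop running elsewhere and can block on the returned concurrent.futures.Future, which also keeps the messages in order.

    import asyncio
    import threading

    async def send_message(text):
        # Hypothetical stand-in for channel.send(); runs on the loop's thread.
        print('sent:', text)

    def watcher(loop):
        # Synchronous code (e.g. an overridden run(self)) hands coroutines
        # to the event loop running in the other thread.
        for i in range(3):
            fut = asyncio.run_coroutine_threadsafe(send_message(f'event {i}'), loop)
            fut.result()  # block until sent; this preserves message ordering

    async def main():
        loop = asyncio.get_running_loop()
        threading.Thread(target=watcher, args=(loop,), daemon=True).start()
        await asyncio.sleep(1)  # keep the loop alive while the thread submits work

    asyncio.run(main())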

Why is python pickle failing to dump dictionaries in 3.7

Submitted by 心已入冬 on 2020-12-30 03:12:57
Question: I recently switched to Python 3.7.3 and I've been updating the code to fit the new version. Before switching, pickle had no problem dumping and loading the dictionaries I sent it, but now it keeps giving me a "TypeError: can't pickle TaskStepMethWrapper objects" error. Searching for TaskStepMethWrapper suggests it may be related to asyncio, but this error did not show up when I was using Python 3.6. Here's my code:

    def load_guildlist(self):
        with open("guildlist
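
TaskStepMethWrapper is an asyncio internal, so the error usually means a live Task (or an object holding one) has ended up inside the dictionary being dumped. A defensive sketch of one possible workaround, filtering out unpicklable values before dumping; dump_guildlist and its filename are assumptions for illustration, not the asker's code:

    import pickle

    def dump_guildlist(guildlist, path='guildlist.pkl'):
        # Tasks and coroutines can't be pickled, so keep only values that can be.
        clean = {}
        for key, value in guildlist.items():
            try:
                pickle.dumps(value)
                clean[key] = value
            except (TypeError, AttributeError, pickle.PicklingError):
                pass  # skip asyncio internals such as TaskStepMethWrapper
        with open(path, 'wb') as f:
            pickle.dump(clean, f)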
