python-asyncio

Is it possible to limit the number of coroutines running concurrently in asyncio?

我们两清 · submitted on 2020-02-20 06:39:10

Question: I already wrote my script using asyncio, but found that the number of coroutines running simultaneously is too large, and it often ends up hanging. So I would like to limit the number of coroutines running concurrently; once the limit is reached, I want to wait for a coroutine to finish before starting another. My current code is something like the following: `loop = asyncio.get_event_loop()` `p = map(my_func, players)` `result = loop.run_until_complete(asyncio.gather(*p))` `async def my`…
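The question is cut off, but the usual answer to this pattern is to wrap each call in an `asyncio.Semaphore` so that at most N coroutines hold the semaphore at once. A minimal sketch, where `my_func` and `players` are stand-ins for the asker's real work:

```python
import asyncio

async def my_func(player):
    # Stand-in for the real I/O-bound work.
    await asyncio.sleep(0.01)
    return player * 2

async def run_limited(players, limit):
    sem = asyncio.Semaphore(limit)

    async def bounded(player):
        # At most `limit` coroutines are past this line at any moment;
        # the rest wait here until a slot frees up.
        async with sem:
            return await my_func(player)

    return await asyncio.gather(*(bounded(p) for p in players))

result = asyncio.run(run_limited(range(5), limit=2))
print(result)  # [0, 2, 4, 6, 8]
```

All tasks are still created up front, but only `limit` of them make progress through the guarded section at a time, which caps the number of in-flight operations without changing the `gather` structure of the original code.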

Why should asyncio.StreamWriter.drain be explicitly called?

放肆的年华 · submitted on 2020-02-18 05:30:22

Question: From the docs (https://docs.python.org/3/library/asyncio-stream.html#asyncio.StreamWriter.write): `write(data)` — Write data to the stream. This method is not subject to flow control. Calls to `write()` should be followed by `drain()`. coroutine `drain()` — Wait until it is appropriate to resume writing to the stream. Example: `writer.write(data)` `await writer.drain()`. From what I understand, you need to call `drain` every time `write` is called; if not, I guess `write` will block the loop thread. Then why is `write` not a…
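The short answer (the question is truncated, so this addresses its apparent thrust): `write()` never blocks — it only appends bytes to an internal buffer — which is exactly why it is a plain method rather than a coroutine. `drain()` is the separate flow-control point: it suspends the caller only when the buffer has grown past the transport's high-water mark, and keeping the two apart lets you batch several `write()` calls and pay for a single `drain()`. A self-contained sketch over a loopback connection:

```python
import asyncio

async def handle(reader, writer):
    data = await reader.read(100)   # returns as soon as some bytes arrive
    writer.write(data.upper())      # only appends to the send buffer
    await writer.drain()            # suspends here only under backpressure
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"hello")          # synchronous: buffers, never blocks the loop
    await writer.drain()            # the explicit flow-control checkpoint
    reply = await reader.read(100)

    writer.close()
    await writer.wait_closed()
    server.close()
    await server.wait_closed()
    return reply

reply = asyncio.run(main())
print(reply)  # b'HELLO'
```

Skipping `drain()` does not block the loop; it just lets the buffer grow without bound if the peer reads slowly, which is the memory problem `drain()` exists to prevent.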

Running URL Requests in Parallel with Flask

爱⌒轻易说出口 · submitted on 2020-02-07 03:08:48

Question: asyncio is still relatively new to me. I am starting with the basics — a simple HTTP hello world — just making approximately 40 parallel GET requests and fetching the first 400 characters of each HTTP response, using Flask (the "parallel" function is invoked per request). It is running on Python 3.7. The traceback is showing errors I don't understand: which "Constructor parameter should be str" is this referring to? How should I proceed? This is the entire code of the app: `import aiohttp` `import`…
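The code and traceback are cut off, so the exact cause can't be confirmed here, but as a general note: aiohttp's URL layer raises "Constructor parameter should be str" when something other than a `str` is passed as the URL to `session.get()`. The fan-out pattern itself can be sketched with a stub in place of the real aiohttp call (the URLs and `fetch` body below are illustrative, not from the question):

```python
import asyncio

# Hypothetical URL list standing in for the asker's 40 targets.
URLS = [f"https://example.com/page/{i}" for i in range(40)]

async def fetch(url):
    # Stand-in for an aiohttp request; in the real app this would be roughly:
    #   async with session.get(url) as resp:        # url MUST be a str
    #       body = await resp.text()
    # Passing a non-str here is what triggers
    # "Constructor parameter should be str" in aiohttp/yarl.
    await asyncio.sleep(0)                # simulate network I/O
    return f"<body of {url}>"[:400]       # first 400 characters only

async def parallel(urls):
    # One coroutine per URL; gather runs them concurrently on one event loop.
    return await asyncio.gather(*(fetch(u) for u in urls))

results = asyncio.run(parallel(URLS))
print(len(results))  # 40
```

If the real app builds URLs from request parameters, a quick `isinstance(url, str)` check before the fan-out is an easy way to locate which value is not a string.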

Python3.5 Asyncio - Preventing task exception from dumping to stdout?

与世无争的帅哥 · submitted on 2020-02-03 10:55:56

Question: I have a text-based interface (the asciimatics module) for my program, which uses asyncio and the discord.py module. Occasionally, when my wifi adapter goes down, I get an exception like so: Task exception was never retrieved future: <Task finished coro=<WebSocketCommonProtocol.run() done, defined at /home/mike/.local/lib/python3.5/site-packages/websockets/protocol.py:428> exception=ConnectionResetError(104, 'Connection reset by peer')> Traceback (most recent call last): File "/usr/lib/python3.5…
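The question is truncated, but one common fix for the "Task exception was never retrieved" dump is to attach a done-callback that retrieves the task's exception yourself: once `Task.exception()` has been called, asyncio considers the exception retrieved and no longer prints it at garbage-collection time, so you can route the error to your own log or UI instead of stdout. A sketch (the `errors` list stands in for whatever logging the asker's interface uses):

```python
import asyncio

errors = []

def retrieve(task):
    # Calling task.exception() marks the exception as retrieved, which
    # suppresses the "Task exception was never retrieved" stdout dump.
    exc = task.exception()
    if exc is not None:
        errors.append(exc)   # route to your UI/log instead of stdout

async def doomed():
    # Stand-in for the websocket coroutine that dies when wifi drops.
    raise ConnectionResetError(104, "Connection reset by peer")

async def main():
    task = asyncio.create_task(doomed())
    task.add_done_callback(retrieve)
    await asyncio.sleep(0.01)   # let the task finish and the callback fire

asyncio.run(main())
print(type(errors[0]).__name__)  # ConnectionResetError
```

An alternative is `loop.set_exception_handler(...)`, which catches unretrieved exceptions loop-wide, but the done-callback approach works per task and doesn't hide unrelated failures.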

AIOHTTP - Application.make_handler(…) is deprecated - Adding Multiprocessing

白昼怎懂夜的黑 · submitted on 2020-02-02 15:45:29

Question: I went down a journey of "How much performance can I squeeze out of a Python web server?" This led me to AIOHTTP and uvloop. Still, I could see that AIOHTTP wasn't using my CPU to its full potential, so I set out to use multiprocessing with AIOHTTP. I learned that there is a Linux kernel feature that allows multiple processes to share the same TCP port. This led me to develop the following code (which works wonderfully): `import asyncio` `import os` `import socket` `import time` `from aiohttp import web`…
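The asker's code is truncated, so only the kernel feature it relies on can be sketched here: `SO_REUSEPORT` (Linux 3.9+) lets each worker process bind its own listening socket to the same TCP port, and the kernel load-balances incoming connections among them. On the deprecation itself, newer aiohttp replaces `Application.make_handler()` with the `web.AppRunner`/`web.SockSite` API, where a pre-bound socket like the one below can be handed to `web.SockSite(runner, sock=sock)` — check the docs for your aiohttp version. A stdlib-only demonstration of the shared-port bind (assumes a platform with `SO_REUSEPORT`):

```python
import socket

def reuseport_socket(port):
    # SO_REUSEPORT (Linux >= 3.9): several sockets/processes may bind the
    # same TCP port; the kernel distributes incoming connections among them.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    sock.bind(("127.0.0.1", port))
    sock.listen(128)
    return sock

a = reuseport_socket(0)                  # port 0: kernel picks a free port
port = a.getsockname()[1]
b = reuseport_socket(port)               # second bind to the SAME port succeeds
same_port = (b.getsockname()[1] == port)
print(same_port)  # True

a.close()
b.close()
```

In the multiprocessing setup, each forked worker would create its own socket this way (or inherit one) and run its own event loop and aiohttp site on it, so no user-space dispatcher process is needed.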
