aiohttp

How can I wrap a synchronous function in an async coroutine?

不羁的心 submitted on 2019-12-20 08:57:02
Question: I'm using aiohttp to build an API server that sends TCP requests off to a separate server. The module that sends the TCP requests is synchronous and a black box for my purposes, so my problem is that these requests block the entire API. I need a way to wrap the module's requests in an asynchronous coroutine that won't block the rest of the API. So, just using sleep as a simple example: is there any way to wrap time-consuming synchronous code in a non-blocking coroutine, something …
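A minimal sketch of one common workaround, using the event loop's default thread pool via run_in_executor; time.sleep below stands in for the black-box synchronous TCP module:

    import asyncio
    import time

    def blocking_request(payload):
        # Placeholder for the synchronous black-box call
        time.sleep(3)
        return f'result for {payload}'

    async def async_request(payload):
        loop = asyncio.get_running_loop()
        # Hands the blocking call to a worker thread; the event loop stays free
        return await loop.run_in_executor(None, blocking_request, payload)

    async def main():
        results = await asyncio.gather(*(async_request(i) for i in range(3)))
        print(results)  # all three complete in ~3s, not ~9s

    asyncio.run(main())

On Python 3.9+, asyncio.to_thread(blocking_request, payload) is a shorter spelling of the same idea.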

Parallel asynchronous IO in Python's coroutines

こ雲淡風輕ζ submitted on 2019-12-19 17:34:22
Question: Simple example: I need to make two unrelated HTTP requests in parallel. What's the simplest way to do that? I expect it to be something like this:

    async def do_the_job():
        with aiohttp.ClientSession() as session:
            coro_1 = session.get('http://httpbin.org/get')
            coro_2 = session.get('http://httpbin.org/ip')
            return combine_responses(await coro_1, await coro_2)

In other words, I want to initiate the IO operations and then wait for their results, so that they effectively run in parallel. This can be achieved with asyncio …
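A minimal sketch of the asyncio.gather approach the excerpt alludes to; the URLs come from the question, and fetch is a hypothetical helper:

    import asyncio
    import aiohttp

    async def do_the_job():
        async with aiohttp.ClientSession() as session:

            async def fetch(url):
                async with session.get(url) as resp:
                    return await resp.text()

            # gather schedules both coroutines so the requests overlap in flight
            return await asyncio.gather(
                fetch('http://httpbin.org/get'),
                fetch('http://httpbin.org/ip'),
            )

    print(asyncio.run(do_the_job()))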

How to combine python asyncio with threads?

╄→尐↘猪︶ㄣ submitted on 2019-12-17 15:23:20
Question: I have successfully built a RESTful microservice with Python asyncio and aiohttp that listens to a POST event to collect realtime events from various feeders. It then builds an in-memory structure to cache the last 24h of events in a nested defaultdict/deque structure. Now I would like to periodically checkpoint that structure to disk, preferably using pickle. Since the memory structure can be >100MB, I would like to avoid holding up my incoming event processing for the time it takes to …
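One hedged sketch of that checkpointing idea: take a quick deep-copy snapshot on the loop, then run the slow pickle-and-write in a worker thread. The cache argument and interval are assumptions standing in for the asker's nested defaultdict/deque structure:

    import asyncio
    import copy
    import pickle

    def dump_to_disk(obj, path):
        with open(path, 'wb') as f:
            pickle.dump(obj, f)

    async def checkpoint_forever(cache, path, interval=60):
        loop = asyncio.get_running_loop()
        while True:
            await asyncio.sleep(interval)
            # deepcopy still blocks briefly, but keeps the snapshot consistent
            # while new events continue mutating the live cache
            snapshot = copy.deepcopy(cache)
            await loop.run_in_executor(None, dump_to_disk, snapshot, path)

Since pickling holds the GIL for much of its work, a ProcessPoolExecutor can offload it more completely, at the cost of serializing the snapshot across the process boundary anyway.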

Preventing random 'RuntimeError: Event loop is closed' in automated test

折月煮酒 submitted on 2019-12-14 03:12:50
Question: Referring to: How do I avoid the loop argument. I am trying to write a SockJS client in Python, and I would like that code to have some automated tests. Here is the code:

    from urllib.parse import urlparse
    import websockets

    class SockJsClient:
        def __init__(self, base_url):
            self.base_url = urlparse(base_url)

        async def connect(self):
            uri = self.base_url.geturl() + "/websocket"
            self.websocket = await websockets.connect(uri)

        async def disconnect(self):
            await self.websocket.close()

and here are my …
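A sketch of one way to test this without loop-closure errors, assuming pytest-asyncio so each test runs inside a managed event loop and all sockets are closed before that loop is. The echo server is a stand-in for a real SockJS endpoint, and the handler signature varies across websockets versions:

    import pytest
    import websockets

    @pytest.mark.asyncio
    async def test_connect_and_disconnect():
        async def echo(ws):  # websockets >= 10.1 allows a single-argument handler
            async for message in ws:
                await ws.send(message)

        # Port 0 asks the OS for a free ephemeral port
        async with websockets.serve(echo, '127.0.0.1', 0) as server:
            port = server.sockets[0].getsockname()[1]
            client = SockJsClient(f'ws://127.0.0.1:{port}')
            await client.connect()
            await client.disconnect()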

'yield from' inside async function Python 3.6.5 aiohttp

拜拜、爱过 submitted on 2019-12-13 09:07:34
Question: SyntaxError: 'yield from' inside async function

    async def handle(request):
        for m in (yield from request.post()):
            print(m)
        return web.Response()

I used Python 3.5 before, found PEP 525, installed Python 3.6.5, and still receive this error.

Answer 1: You are using the new async / await syntax to define and execute coroutines, but have not made a full switch. You need to use await here:

    async def handle(request):
        post_data = await request.post()
        for m in post_data:
            print(m)
        return web.Response()

If you …

How can I detect the closure of a Python aiohttp web socket on the server when the client exits uncleanly

*爱你&永不变心* submitted on 2019-12-13 03:11:24
Question: I have a simple command-and-control server server.py (completely insecure - don't use), a passive client update_client.py, and another client that can send commands, update_commander.py. There is an HTTP endpoint at http://0.0.0.0:8080/ which lists the connected clients. When the update_commander.py script exits, its client gets cleaned up properly. When update_client.py disconnects, the server doesn't notice the disconnection, and upon further messages sent by the update_commander.py I …
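A hedged sketch of the usual server-side fix: enable aiohttp's websocket heartbeat so dead peers are detected by missed pings, and unregister sockets in a finally block. The connected set is a hypothetical client registry:

    from aiohttp import web, WSMsgType

    connected = set()

    async def ws_handler(request):
        # heartbeat makes aiohttp ping every 30s; an unanswered ping closes
        # the socket even if the client vanished without sending a close frame
        ws = web.WebSocketResponse(heartbeat=30)
        await ws.prepare(request)
        connected.add(ws)
        try:
            async for msg in ws:  # this loop ends when the connection drops
                if msg.type == WSMsgType.ERROR:
                    break
        finally:
            connected.discard(ws)  # runs for clean and unclean exits alike
        return ws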

How to use SOCKS proxies to make requests with aiohttp?

余生颓废 submitted on 2019-12-12 13:15:32
Question: I am trying to use aiohttp to make asynchronous HTTP requests over multiple SOCKS proxies. Basically, I am creating a pool of Tor clients with different IP addresses, and want to be able to route HTTP requests through them using aiohttp. Based on the suggestions here and here, I have been trying to use aiosocks, but the examples in those threads do not work (if they ever did) because they are based on an old version of aiosocks with a different API. Documentation and examples of using …
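A sketch using the third-party aiohttp-socks package (pip install aiohttp_socks), which is commonly suggested in place of aiosocks for this use case; the local Tor SOCKS ports below are assumptions:

    import asyncio
    import aiohttp
    from aiohttp_socks import ProxyConnector

    async def fetch_via_proxy(url, socks_port):
        connector = ProxyConnector.from_url(f'socks5://127.0.0.1:{socks_port}')
        async with aiohttp.ClientSession(connector=connector) as session:
            async with session.get(url) as resp:
                return await resp.text()

    async def main():
        # One Tor client per SOCKS port, each presenting a different exit IP
        results = await asyncio.gather(
            fetch_via_proxy('http://httpbin.org/ip', 9050),
            fetch_via_proxy('http://httpbin.org/ip', 9052),
        )
        print(results)

    asyncio.run(main())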

Multiple aiohttp Application()s running in the same process?

不打扰是莪最后的温柔 submitted on 2019-12-12 10:29:55
Question: Can two aiohttp.web.Application() objects be running in the same process, e.g. on different ports? I see a bunch of examples of aiohttp code like:

    from aiohttp import web

    app = web.Application()
    app.router.add_get('/foo', foo_view, name='foo')
    web.run_app(app, host='0.0.0.0', port=10000)

I'm wondering if there's some equivalent where multiple web.Application()s can be configured to run at the same time. Something like:

    from aiohttp import web

    app1 = web.Application()
    app1.router.add_get('/foo …
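A minimal sketch using aiohttp's AppRunner/TCPSite API to serve two applications on different ports inside one process and one event loop; the handlers and ports here are placeholders:

    import asyncio
    from aiohttp import web

    async def handler(request):
        return web.Response(text=f'served by {request.app["name"]}')

    async def start(app, port):
        runner = web.AppRunner(app)
        await runner.setup()
        await web.TCPSite(runner, '0.0.0.0', port).start()
        return runner

    async def main():
        app1, app2 = web.Application(), web.Application()
        app1['name'], app2['name'] = 'app1', 'app2'
        app1.router.add_get('/foo', handler)
        app2.router.add_get('/bar', handler)
        runners = [await start(app1, 10000), await start(app2, 10001)]
        try:
            await asyncio.Event().wait()  # serve until the process is stopped
        finally:
            for runner in runners:
                await runner.cleanup()

    asyncio.run(main())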

Is there any way to use aiohttp client with socks proxy?

╄→гoц情女王★ submitted on 2019-12-12 08:04:40
Question: It looks like aiohttp.ProxyConnector doesn't support SOCKS proxies. Is there any workaround for this? I would be grateful for any advice.

Answer 1: Have you tried aiosocks?

    import asyncio
    import aiohttp
    import aiosocks
    from aiosocks.connector import SocksConnector

    conn = SocksConnector(proxy=aiosocks.Socks5Addr(PROXY_ADDRESS, PROXY_PORT),
                          proxy_auth=None, remote_resolve=True)
    session = aiohttp.ClientSession(connector=conn)

    async with session.get('http://python.org') as resp:
        assert resp.status == 200

Answer 2: aiosocks …

Even using asyncio and aiohttp, methods wait for the request response

房东的猫 submitted on 2019-12-12 01:25:30
Question: Hi, I have the following issue: I want to execute the getLastItemFromGivenInterval method, let it briefly run without waiting for the request responses, and hand control back so that asyncio.sleep(60) executes the whole procedure once more in 60-second time frames. What I get instead is getLastItemFromGivenInterval() waiting for the request to end.

    import aiohttp
    import asyncio

    loop = asyncio.get_event_loop()
    task = loop.create_task(main())
    loop.run_forever()

    async def main():
        async with aiohttp.ClientSession …
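A hedged sketch of the non-blocking pattern the asker is after: schedule each fetch as a background task rather than awaiting it, so the 60-second cadence is unaffected by slow responses. fetch_items and its URL are placeholders for the getLastItemFromGivenInterval logic:

    import asyncio
    import aiohttp

    async def fetch_items(session):
        async with session.get('http://httpbin.org/get') as resp:
            print(await resp.json())

    async def main():
        async with aiohttp.ClientSession() as session:
            while True:
                # create_task fires the request without waiting for its response
                asyncio.create_task(fetch_items(session))
                await asyncio.sleep(60)

    asyncio.run(main())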