aiohttp

Python aiohttp/asyncio - how to process returned data

一个人想着一个人 submitted on 2019-12-03 02:02:21
I'm in the process of moving some synchronous code to asyncio using aiohttp. The synchronous code was taking 15 minutes to run, so I'm hoping to improve on this. I have some working code which gets data from some URLs and returns the body of each. But this is just against 1 lab site; I have 70+ actual sites. So if I wrote a loop to create a list of all the URLs for all sites, that would make 700 URLs in a list to be processed. Now, processing them I don't think is a problem. But doing 'stuff' with the results, I'm not sure how to program. I have code already that will do 'stuff' to each of the…
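
A common shape for this is to attach the post-processing to each fetch and then gather everything, so results are handled as they arrive rather than all at the end. A minimal sketch; process_body is a hypothetical stand-in for the asker's 'stuff':

    import asyncio
    import aiohttp

    def process_body(url, body):
        # hypothetical synchronous post-processing ('stuff')
        return len(body)

    async def fetch_and_process(session, url):
        # do the 'stuff' as soon as each response arrives,
        # rather than collecting all 700 bodies first
        async with session.get(url) as resp:
            body = await resp.text()
        return process_body(url, body)

    async def main(urls):
        async with aiohttp.ClientSession() as session:
            tasks = [fetch_and_process(session, url) for url in urls]
            return await asyncio.gather(*tasks)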

How can I wrap a synchronous function in an async coroutine?

南笙酒味 submitted on 2019-12-02 17:30:30
I'm using aiohttp to build an API server that sends TCP requests off to a separate server. The module that sends the TCP requests is synchronous and a black box for my purposes. So my problem is that these requests are blocking the entire API. I need a way to wrap the module requests in an asynchronous coroutine that won't block the rest of the API. So, just using sleep as a simple example, is there any way to somehow wrap time-consuming synchronous code in a non-blocking coroutine, something like this:

    async def sleep_async(delay):
        # After calling sleep, loop should be released until sleep is…
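
The standard answer is to push the blocking call onto an executor via the event loop. A minimal sketch using the default thread pool; for the real black-box module, the module call would replace time.sleep:

    import asyncio
    import time

    async def sleep_async(delay):
        # run_in_executor moves the blocking call to a worker thread,
        # so the event loop is released for the rest of the API
        loop = asyncio.get_event_loop()
        return await loop.run_in_executor(None, time.sleep, delay)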

specify log request format in aiohttp 2

≡放荡痞女 submitted on 2019-12-02 03:42:59
Question: I'm using aiohttp 2 with Python 3.6 and want to log the requests coming to the application. I did:

    # use ISO timestamps
    from time import gmtime
    logging.Formatter.converter = gmtime

    # create a formatter
    ch = logging.StreamHandler()
    formatter = logging.Formatter('%(asctime)s %(levelname)s %(name)s - %(message)s', '%Y-%m-%dT%H:%M:%S')
    ch.setFormatter(formatter)

    # show all messages (default is WARNING)
    logging.getLogger('aiohttp.access').setLevel(logging.DEBUG)

    # attach the handler
    logging…
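
For the content of each access-log line itself, aiohttp exposes an access_log_format parameter whose directives (%a remote address, %t time, %r request line, %s status, %b response size) come from its AccessLogger. A minimal sketch:

    from aiohttp import web

    app = web.Application()

    # customise what each access-log line contains
    web.run_app(app, access_log_format='%a %t "%r" %s %b "%{User-Agent}i"')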

async - sync - async calls in one python event loop

大城市里の小女人 submitted on 2019-12-01 17:19:29
Question: Let's say I have a class which uses an asyncio loop internally and doesn't have an async interface:

    class Fetcher:
        _loop = None
        def get_result(...):
            """
            After 3 nested sync calls, async tasks are finally
            called with *run_until_complete*
            """
            ...

I use all the advantages of asyncio internally and don't have to care about it in the outer code. But then I want to call 3 Fetcher instances in one event loop. If I had an async def interface there would be no problem: asyncio.gather could help me. Is there really…
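
One way out of the async–sync–async sandwich is to keep the coroutine as the primary interface and make the sync method a thin facade over it, so callers that already have a loop can gather the coroutines directly. A sketch under that assumption (the sleep is a placeholder for the nested async work):

    import asyncio

    class Fetcher:
        async def get_result_async(self):
            # the real work lives in a coroutine
            await asyncio.sleep(0.1)  # placeholder for the nested async calls
            return 'result'

        def get_result(self):
            # sync facade for callers without a running loop
            return asyncio.get_event_loop().run_until_complete(self.get_result_async())

    async def main():
        fetchers = [Fetcher() for _ in range(3)]
        return await asyncio.gather(*(f.get_result_async() for f in fetchers))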

Parallel asynchronous IO in Python's coroutines

前提是你 submitted on 2019-12-01 17:15:21
Simple example: I need to make two unrelated HTTP requests in parallel. What's the simplest way to do that? I expect it to be like this:

    async def do_the_job():
        async with aiohttp.ClientSession() as session:
            coro_1 = session.get('http://httpbin.org/get')
            coro_2 = session.get('http://httpbin.org/ip')
            return combine_responses(await coro_1, await coro_2)

In other words, I want to initiate the IO operations and wait for their results so they effectively run in parallel. This can be achieved with asyncio.gather:

    async def do_the_job():
        async with aiohttp.ClientSession() as session:
            coro_1 = session.get('http:/…
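
Completing the truncated gather version as a runnable sketch (reading the bodies with .text() in place of the asker's combine_responses):

    import asyncio
    import aiohttp

    async def do_the_job():
        async with aiohttp.ClientSession() as session:
            # gather schedules both requests concurrently and waits for both
            resp_1, resp_2 = await asyncio.gather(
                session.get('http://httpbin.org/get'),
                session.get('http://httpbin.org/ip'),
            )
            return await resp_1.text(), await resp_2.text()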

aiohttp - before request for each API call

人走茶凉 submitted on 2019-12-01 13:26:39
Question: When I was using Flask, every API call was authenticated before being processed:

    app = connexion.App(__name__, specification_dir='./swagger/', swagger_json=True, swagger_ui=True, server='tornado')
    app.app.json_encoder = encoder.JSONEncoder
    app.add_api('swagger.yaml', arguments={'title': 'ABCD API'})

    # add CORS support
    CORS(app.app)

    @app.app.before_request
    def before_request_func():
        app_id = request.headers.get("X-AppId")
        token = request.headers.get("X-Token")
        user, success = security.Security()…
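
In aiohttp, the equivalent of Flask's before_request hook is a middleware, which runs for every request before the handler. A sketch; the header check below is a hypothetical stand-in for the asker's security module:

    from aiohttp import web

    @web.middleware
    async def auth_middleware(request, handler):
        app_id = request.headers.get('X-AppId')
        token = request.headers.get('X-Token')
        if not app_id or not token:  # hypothetical check
            raise web.HTTPUnauthorized()
        return await handler(request)

    app = web.Application(middlewares=[auth_middleware])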

Maximize number of parallel requests (aiohttp)

心不动则不痛 submitted on 2019-12-01 01:53:11
tl;dr: how do I maximize the number of HTTP requests I can send in parallel? I am fetching data from multiple URLs with the aiohttp library. I'm testing its performance and I've observed that somewhere in the process there is a bottleneck, where running more URLs at once just doesn't help. I am using this code:

    import asyncio
    import aiohttp

    async def fetch(url, session):
        headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0'}
        try:
            async with session.get(url, headers=headers, ssl=False,
                                   timeout=aiohttp.ClientTimeout(total=None, sock_connect=10,…
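
One frequent answer here: ClientSession's connection pool is capped at 100 simultaneous connections by default (the TCPConnector limit), which silently bounds parallelism no matter how many tasks are created. A sketch that raises the cap:

    import aiohttp

    async def main(urls):
        # limit=0 removes the cap entirely; limit=500 just raises it
        connector = aiohttp.TCPConnector(limit=500)
        async with aiohttp.ClientSession(connector=connector) as session:
            ...  # schedule fetch(url, session) tasks as before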

How to use an aiohttp ClientSession with Sanic?

佐手、 submitted on 2019-11-30 15:43:16
I am trying to understand what is the right way to use aiohttp with Sanic. From the aiohttp documentation, I find the following: "Don't create a session per request. Most likely you need a session per application which performs all requests altogether. More complex cases may require a session per site, e.g. one for Github and another one for Facebook APIs. Anyway, making a session for every request is a very bad idea. A session contains a connection pool inside. Connection reuse and keep-alive (both are on by default) may speed up total performance." And when I go to the Sanic documentation I find an…
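
The commonly recommended pattern is to create one shared session in Sanic's before_server_start listener, so it is bound to the server's own event loop, and close it on shutdown. A sketch under that assumption:

    import aiohttp
    from sanic import Sanic

    app = Sanic(__name__)

    @app.listener('before_server_start')
    async def init_session(app, loop):
        # one session per application, created on Sanic's loop
        app.aiohttp_session = aiohttp.ClientSession(loop=loop)

    @app.listener('after_server_stop')
    async def close_session(app, loop):
        await app.aiohttp_session.close()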

ssl/asyncio: traceback even when error is handled

℡╲_俬逩灬. submitted on 2019-11-30 14:03:01
Question: Trying to download and process JPEGs from URLs. My issue isn't that certificate verification fails for some URLs, as these URLs are old and may no longer be trustworthy, but that when I try…except… the SSLCertVerificationError, I still get the traceback. System: Linux 4.17.14-arch1-1-ARCH, Python 3.7.0-3, aiohttp 3.3.2. Minimal example:

    import asyncio
    import aiohttp
    from ssl import SSLCertVerificationError

    async def fetch_url(url, client):
        try:
            async with client.get(url) as resp:
                print…
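
One commonly suggested adjustment (a sketch, and not necessarily the whole story for this particular bug report): catch the exception type aiohttp itself raises, since the raw ssl error arrives wrapped in a connector exception:

    import aiohttp

    async def fetch_url(url, client):
        try:
            async with client.get(url) as resp:
                return await resp.read()
        except aiohttp.ClientConnectorCertificateError as e:
            # aiohttp wraps the underlying SSLCertVerificationError
            # in its own connector exception type
            print('certificate verification failed for', url, e)
            return None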

asyncio aiohttp progress bar with tqdm

六眼飞鱼酱① submitted on 2019-11-30 13:01:30
Question: I'm attempting to integrate a tqdm progress bar to monitor POST requests generated with aiohttp in Python 3.5. I have a working progress bar but can't seem to gather results using as_completed(). Pointers gratefully received. Examples I've found suggest the following pattern, which is incompatible with Python 3.5 async def definitions:

    for f in tqdm.tqdm(asyncio.as_completed(tasks), total=len(coros)):
        yield from f

Working (albeit redacted) async code without the progress bar:

    def async…
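
Inside an async def, the yield from simply becomes await; as_completed yields futures in completion order, which is what lets the bar tick as each request finishes. A minimal sketch:

    import asyncio
    import tqdm

    async def run_with_progress(tasks):
        results = []
        for f in tqdm.tqdm(asyncio.as_completed(tasks), total=len(tasks)):
            results.append(await f)  # await replaces `yield from` in async def
        return results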