aiohttp

Python aiohttp has been receiving SSL transport errors

雨燕双飞 · submitted 2020-01-04 06:31:12
Question: We have an application that relies heavily on asyncio. It sends hundreds of GET requests per minute, mostly to the same host but with different URLs. For about three weeks we have observed the following issue: the process gets stuck, often for up to (exactly) 2400 seconds, and we see the following error in the logs: 2018-12-07T23:37:33Z ERROR base_events.py: Fatal error on SSL transport protocol: File "/usr/lib64/python3.6/asyncio/sslproto.py", line 638, in _process_write_backlog ssldata
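A commonly reported mitigation for this kind of multi-minute hang is to bound every request with an explicit timeout, so that a wedged SSL transport surfaces as `asyncio.TimeoutError` instead of stalling silently. A minimal stdlib sketch of the pattern (the hanging coroutine is a stand-in for the real aiohttp call, and `asyncio.run` from Python 3.7+ is used for brevity even though the question targets 3.6; in aiohttp one would instead pass `timeout=aiohttp.ClientTimeout(total=...)` to the session):

```python
import asyncio

async def stuck_request():
    # stands in for an aiohttp request whose SSL transport has wedged
    await asyncio.sleep(3600)

async def guarded_fetch():
    try:
        # bound the await so a wedged transport cannot stall for 2400 s
        return await asyncio.wait_for(stuck_request(), timeout=0.05)
    except asyncio.TimeoutError:
        return "timed out"

result = asyncio.run(guarded_fetch())
print(result)
```

This does not fix the underlying `sslproto.py` bug, but it converts an invisible stall into an exception the application can retry on.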

Python 3.5 async for blocks the ioloop

て烟熏妆下的殇ゞ · submitted 2020-01-03 17:35:54
Question: I have a simple aiohttp server with two handlers. The first does some computation in an async for loop; the second just returns a text response. not_so_long_operation returns the 30th Fibonacci number with the slowest recursive implementation, which takes about one second.

def not_so_long_operation():
    return fib(30)

class arange:
    def __init__(self, n):
        self.n = n
        self.i = 0
    async def __aiter__(self):
        return self
    async def __anext__(self):
        i = self.i
        self.i += 1
        if self.i <= self.n:
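Two things keep this iterator from cooperating with the loop: since Python 3.5.2, `__aiter__` should be a plain method (not a coroutine) that returns the iterator, and `__anext__` never awaits anything, so `async for` gives other handlers no chance to run (the blocking `fib(30)` call itself would additionally need to be offloaded, e.g. with `loop.run_in_executor`). A hedged sketch of a cooperative version of the `arange` class from the excerpt:

```python
import asyncio

class arange:
    def __init__(self, n):
        self.n = n
        self.i = 0

    def __aiter__(self):
        # plain method since Python 3.5.2, not `async def`
        return self

    async def __anext__(self):
        if self.i >= self.n:
            raise StopAsyncIteration
        self.i += 1
        # yield control so other handlers get scheduled between steps
        await asyncio.sleep(0)
        return self.i - 1

async def main():
    return [i async for i in arange(5)]

result = asyncio.run(main())
print(result)
```

The `await asyncio.sleep(0)` is the cooperation point: without at least one `await` per iteration, the event loop cannot switch to the other handler.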

Python aiohttp request stopped but raised no exception

非 Y 不嫁゛ · submitted 2020-01-03 12:13:06
Question: I use aiohttp to request a URL. Most of the time it runs normally, but sometimes it stops without raising any exception. As you can see in the code, I catch all exceptions, but when it stops, no exception is logged. The logs look like: get_live_league_games: while True try yield from aiohttp.request — but the res = yield from r.json() line never prints; it stops and does not throw any exception.

while True:
    print('get_live_league_games: while True')
    start = time.clock()
    try:
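Because a stalled `r.json()` raises nothing, no `except` clause can ever fire. One way to surface the hang is to race the request against a deadline with `asyncio.wait` and cancel the task if it comes back still pending. A stdlib sketch, where the hanging coroutine is a stand-in for the aiohttp call:

```python
import asyncio

async def request_that_hangs():
    # stands in for a `yield from r.json()` that never completes
    await asyncio.sleep(3600)

async def main():
    task = asyncio.ensure_future(request_that_hangs())
    done, pending = await asyncio.wait({task}, timeout=0.05)
    for t in pending:
        t.cancel()            # nothing was raised, so cancel explicitly
    return "hung" if task in pending else "ok"

status = asyncio.run(main())
print(status)
```

Unlike `asyncio.wait_for`, `asyncio.wait` does not raise on timeout; it hands back the pending tasks so the loop body can log, cancel, and retry.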

How to pass additional parameters to handle_client coroutine?

∥☆過路亽.° · submitted 2019-12-31 01:46:07
Question: The recommended way to use asyncio for a socket server is:

import asyncio

async def handle_client(reader, writer):
    request = (await reader.read(100)).decode()
    response = "Data received."
    writer.write(response.encode())

async def main():
    loop.create_task(asyncio.start_server(handle_client, 'localhost', 15555))

loop = asyncio.get_event_loop()
loop.create_task(main())
loop.run_forever()

This works fine, but now I need to receive the appropriate client request and then use the aiohttp library to fetch
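The usual answer is `functools.partial`: `asyncio.start_server` only ever calls the handler with `(reader, writer)`, so any extra parameters are bound in advance. A self-contained sketch that also exercises the server with a local client (port 0 asks the OS for any free port; the `greeting` parameter is a hypothetical example of the extra state one might pass):

```python
import asyncio
import functools

async def handle_client(reader, writer, greeting):
    # `greeting` was bound via functools.partial before start_server saw it
    request = (await reader.read(100)).decode()
    writer.write(f"{greeting}: {request}".encode())
    await writer.drain()
    writer.close()

async def main():
    handler = functools.partial(handle_client, greeting="Got")
    server = await asyncio.start_server(handler, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    # exercise the server once with a throwaway client
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"ping")
    writer.write_eof()
    reply = (await reader.read(100)).decode()
    writer.close()
    server.close()
    await server.wait_closed()
    return reply

reply = asyncio.run(main())
print(reply)
```

The same binding trick works for an `aiohttp.ClientSession` created at startup: partially apply it into the handler rather than creating a session per connection.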

Python 3.5 asyncio and aiohttp: Errno 101 Network is unreachable

天大地大妈咪最大 · submitted 2019-12-30 02:11:48
Question: I am using Python 3.5 on Ubuntu 16. I am trying to use aiohttp to write a simple client. Here is the code I have; I took it from here. It's the first code sample, with the SSL check disabled:

import aiohttp
import asyncio
import async_timeout

async def fetch(session, url):
    with async_timeout.timeout(10):
        async with session.get(url) as response:
            return await response.text()

async def main(loop):
    conn = aiohttp.TCPConnector(verify_ssl=False)
    async with aiohttp.ClientSession(loop=loop, connector
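Errno 101 frequently means the hostname resolved to an IPv6 address on a machine with no IPv6 route. A commonly reported fix is to restrict the connector to IPv4, in aiohttp via `aiohttp.TCPConnector(family=socket.AF_INET)` (combinable with the SSL-check flag above). A stdlib sketch of what that family restriction does at name-resolution time:

```python
import socket

def resolve_ipv4_only(host, port):
    # same restriction aiohttp applies with TCPConnector(family=socket.AF_INET):
    # getaddrinfo may only return IPv4 results, so an unroutable IPv6
    # address can never be selected for the connection
    infos = socket.getaddrinfo(host, port, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    return [info[4][0] for info in infos]

addrs = resolve_ipv4_only("localhost", 80)
print(addrs)
```

Note also that `verify_ssl=` was later deprecated in aiohttp in favour of `ssl=False`.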

Fetching multiple urls with aiohttp in Python 3.5

会有一股神秘感。 · submitted 2019-12-29 03:30:14
Question: Since Python 3.5 introduced async with, the syntax recommended in the docs for aiohttp has changed. Now, to get a single URL, they suggest:

import aiohttp
import asyncio

async def fetch(session, url):
    with aiohttp.Timeout(10):
        async with session.get(url) as response:
            return await response.text()

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    with aiohttp.ClientSession(loop=loop) as session:
        html = loop.run_until_complete(
            fetch(session, 'http://python.org'))
    print(html)

How can I
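For multiple URLs, the standard pattern is one coroutine per URL, all driven concurrently by `asyncio.gather`, sharing a single session. A sketch with a stand-in `fetch` so it runs without network access (with aiohttp, `fetch` would be the `session.get` version from the excerpt and `session` a real `ClientSession`):

```python
import asyncio

async def fetch(session, url):
    # stand-in for: async with session.get(url) as response:
    #                   return await response.text()
    await asyncio.sleep(0.01)
    return f"<html for {url}>"

async def fetch_all(session, urls):
    # gather drives all fetches concurrently and returns results in order
    return await asyncio.gather(*(fetch(session, url) for url in urls))

urls = ["http://python.org", "http://python.org/dev"]
pages = asyncio.run(fetch_all(None, urls))
print(len(pages))
```

`gather` preserves input order, so `pages[i]` always corresponds to `urls[i]` regardless of which request finished first.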

asyncio/aiohttp not returning response

我只是一个虾纸丫 · submitted 2019-12-25 02:52:58
Question: I am trying to scrape some data from https://www.officialcharts.com/ by parallelising web requests using asyncio/aiohttp. I implemented the code given at the link here. I followed two different procedures. The first one goes like this:

from bs4 import BeautifulSoup
from urllib.request import urlopen
from selenium import webdriver
import time
import pandas as pd
import numpy as np
import re
import json
import requests
from bs4 import BeautifulSoup
from datetime import date, timedelta
from
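When a scraping batch "never returns", two frequent causes are a single coroutine's unhandled exception tearing down the whole `gather`, and too many simultaneous connections overwhelming the host. `return_exceptions=True` plus a semaphore addresses both. A stand-in sketch of the pattern (the sleep replaces the real aiohttp GET):

```python
import asyncio

async def fetch(url, sem):
    async with sem:                 # cap the number of in-flight requests
        await asyncio.sleep(0.01)   # stand-in for an aiohttp GET + parse
        if "bad" in url:
            raise ValueError(url)
        return url

async def scrape(urls):
    sem = asyncio.Semaphore(10)
    # return_exceptions=True: one failing URL no longer aborts the batch;
    # the exception comes back in-place in the results list instead
    return await asyncio.gather(*(fetch(u, sem) for u in urls),
                                return_exceptions=True)

results = asyncio.run(scrape(["u1", "bad-u2", "u3"]))
print([r if not isinstance(r, Exception) else "failed" for r in results])
```

After the batch completes, exceptions can be filtered out with `isinstance(r, Exception)` and those URLs retried.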

Why use explicit loop parameter with aiohttp?

旧时模样 · submitted 2019-12-24 09:18:00
Question: The aiohttp library's documentation states: loop – event loop used for processing HTTP requests. If param is None, asyncio.get_event_loop() is used for getting default event loop, but we strongly recommend to use explicit loops everywhere. (optional) It is possible to pass the loop to ClientSession objects, to provided "module-level" functions, etc. I am new to the asynchronous programming concept as a whole; could you explain why it's recommended to explicitly provide the loop to use,
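For context beyond the quoted docs: that advice dates from when code might juggle several loops, or create objects before the default loop existed, so `asyncio.get_event_loop()` could silently bind to the wrong one. Since Python 3.7 the running loop is always discoverable from inside a coroutine with `asyncio.get_running_loop()`, and current aiohttp releases have deprecated their `loop=` parameters accordingly. A small check of that guarantee:

```python
import asyncio

async def main():
    # inside a coroutine there is exactly one loop and it is discoverable,
    # so nothing needs the loop threaded through as a parameter any more
    return asyncio.get_running_loop() is asyncio.get_event_loop()

same_loop = asyncio.run(main())
print(same_loop)
```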

Python asynchronous REST API with responses which rely on CPU intensive calculations. How to handle efficiently? [duplicate]

喜欢而已 · submitted 2019-12-23 19:10:54
Question: This question already has an answer here: How does the asyncio module work, why is my updated sample running synchronously? (1 answer). Closed last year. I have written a basic REST API using aiohttp; a simplified version is included below to illustrate the problem I am looking to solve. The API has two endpoints, each of which calls a function that performs some calculations. The difference between the two is that for one of the endpoints, the calculations take 10 seconds, and for
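The standard answer for this situation is to push the CPU-bound calculation into an executor via `loop.run_in_executor`, so the event loop stays free to serve the fast endpoint while the slow one computes. A sketch using the default thread pool to stay self-contained (for pure-Python number crunching, a `concurrent.futures.ProcessPoolExecutor` would sidestep the GIL; the handlers here are simplified stand-ins for the aiohttp endpoint handlers):

```python
import asyncio

def heavy_calculation(n):
    # CPU-bound work that would otherwise freeze every other request
    return sum(i * i for i in range(n))

async def handle(n):
    loop = asyncio.get_running_loop()
    # None selects the default ThreadPoolExecutor; pass a
    # ProcessPoolExecutor instead to escape the GIL for pure-Python work
    return await loop.run_in_executor(None, heavy_calculation, n)

async def main():
    # both "endpoints" are served concurrently by the one event loop
    return await asyncio.gather(handle(10), handle(100))

totals = asyncio.run(main())
print(totals)
```

The key point is that the `await` on `run_in_executor` suspends only that request's coroutine; every other handler keeps running on the loop in the meantime.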