How to Speed Up Python's urllib2 when doing multiple requests

深忆病人, asked 2020-12-09 03:24

I am making several HTTP requests to a particular host using Python's urllib2 library. Each time a request is made, a new TCP and HTTP connection is created, which takes a noticeable amount of time. Is there any way to keep the connection alive and reuse it across requests?
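
Below is a minimal sketch of the pattern being described (Python 2, with hypothetical URLs): each call to urllib2.urlopen negotiates a fresh TCP connection to the same host, which is the overhead in question.

    import urllib2

    # Each iteration opens a brand-new TCP/HTTP connection, even though every
    # request goes to the same host. The URLs here are made up for illustration.
    urls = ['http://www.example.com/page%d' % i for i in range(10)]
    for url in urls:
        response = urllib2.urlopen(url)  # new connection per request
        print url, response.getcode(), len(response.read())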

3 Answers
  •  暖寄归人, answered 2020-12-09 04:15

    I've used the third-party urllib3 library to good effect in the past. It's designed to complement urllib2 by pooling connections for reuse.

    Modified example from the wiki:

    >>> from urllib3 import HTTPConnectionPool
    >>> # Create a connection pool for a specific host
    ... http_pool = HTTPConnectionPool('www.google.com')
    >>> # simple GET request, for example
    ... r = http_pool.urlopen('GET', '/')
    >>> print r.status, len(r.data)
    200 28050
    >>> r = http_pool.urlopen('GET', '/search?q=hello+world')
    >>> print r.status, len(r.data)
    200 79124
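
    To address the original question directly, the same pool object can be reused for a whole series of requests; urllib3 keeps the underlying connection alive between them. A short sketch following the API used above (the query paths are hypothetical):

        # Reuse the pool created above for repeated requests to the same host.
        # The pooled connection stays alive, so there is no new TCP handshake
        # for every request.
        paths = ['/search?q=term%d' % i for i in range(10)]  # hypothetical paths
        for path in paths:
            r = http_pool.urlopen('GET', path)
            print path, r.status, len(r.data)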
    
