Multiple (asynchronous) connections with urllib2 or other http library?

耶瑟儿~ · 2020-11-29 04:43

I have code like this, which requests pages one at a time and retries each request until it succeeds:

for p in range(1, 1000):
    result = False
    while result is False:
        ret = urllib2.Request('http://server/?' + str(p))
        try:
            result = urllib2.urlopen(ret).read()  # retry until the fetch succeeds
        except urllib2.URLError:
            pass

Fetching the pages one after another is slow. How can I make these requests concurrently (asynchronously) with urllib2 or another HTTP library?

6 Answers
  •  栀梦 · 2020-11-29 05:30

    Take a look at gevent, a coroutine-based Python networking library that uses greenlet to provide a high-level synchronous API on top of the libevent event loop.

    Example:

    #!/usr/bin/python
    # Copyright (c) 2009 Denis Bilenko. See LICENSE for details.
    
    """Spawn multiple workers and wait for them to complete"""
    
    urls = ['http://www.google.com', 'http://www.yandex.ru', 'http://www.python.org']
    
    import gevent
    from gevent import monkey
    
    # patches stdlib (including socket and ssl modules) to cooperate with other greenlets
    monkey.patch_all()
    
    import urllib2
    
    
    def print_head(url):
        print 'Starting %s' % url
        data = urllib2.urlopen(url).read()
        print '%s: %s bytes: %r' % (url, len(data), data[:50])
    
    jobs = [gevent.spawn(print_head, url) for url in urls]
    
    gevent.joinall(jobs)
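
    Applied to the loop from the question, the same idea might look like the sketch below. The http://server/?N URLs come from the question; the fetch helper and the pool size of 20 are assumptions for illustration. gevent.pool.Pool caps how many requests run at once, which is usually wise when hitting a single server with ~1000 requests:

    from gevent import monkey
    monkey.patch_all()  # make urllib2's sockets cooperative

    import urllib2
    from gevent.pool import Pool

    def fetch(p):
        # hypothetical helper: retry until the request succeeds,
        # mirroring the while-loop in the question
        while True:
            try:
                return urllib2.urlopen('http://server/?' + str(p)).read()
            except urllib2.URLError:
                pass

    pool = Pool(20)  # assumed limit: at most 20 concurrent requests
    results = pool.map(fetch, range(1, 1000))

    Because monkey.patch_all() makes the blocking urllib2 calls yield to other greenlets while they wait on the network, the requests overlap without any changes to the fetching code itself.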
    
