Proxy Check in Python

既然无缘 2020-12-23 14:55

I have written a script in Python that uses cookies and POST/GET. I also included proxy support in my script. However, when a dead proxy is entered, the script crashes. Is there a way to check the proxy first so the script can handle it instead of crashing?

4 Answers
  •  挽巷  2020-12-23 15:54

    The simplest way is to catch the IOError exception that urllib raises (Python 2 shown here; a Python 3 sketch follows the block):

    try:
        urllib.urlopen(
            "http://example.com",
            proxies={'http':'http://example.com:8080'}
        )
    except IOError:
        print "Connection error! (Check proxy)"
    else:
        print "All was fine"
    

    Also, adapted from the blog post "check status proxy address" (with some slight improvements):

    For Python 2:

    import urllib2
    import socket
    
    def is_bad_proxy(pip):    
        try:
            proxy_handler = urllib2.ProxyHandler({'http': pip})
            opener = urllib2.build_opener(proxy_handler)
            opener.addheaders = [('User-agent', 'Mozilla/5.0')]
            urllib2.install_opener(opener)
            req = urllib2.Request('http://www.example.com')  # change the URL to test here
            sock = urllib2.urlopen(req)  # raises URLError if the proxy is dead or unreachable
        except urllib2.HTTPError, e:
            print 'Error code: ', e.code
            return e.code  # a non-zero HTTP error code is truthy, so it counts as a bad proxy
        except Exception, detail:
            print "ERROR:", detail
            return True
        return False
    
    def main():
        socket.setdefaulttimeout(120)
    
        # two sample proxy IPs
        proxyList = ['125.76.226.9:80', '213.55.87.162:6588']
    
        for currentProxy in proxyList:
            if is_bad_proxy(currentProxy):
                print "Bad Proxy %s" % (currentProxy)
            else:
                print "%s is working" % (currentProxy)
    
    if __name__ == '__main__':
        main()
    

    For Python 3:

    import urllib.request
    import socket
    import urllib.error
    
    def is_bad_proxy(pip):    
        try:
            proxy_handler = urllib.request.ProxyHandler({'http': pip})
            opener = urllib.request.build_opener(proxy_handler)
            opener.addheaders = [('User-agent', 'Mozilla/5.0')]
            urllib.request.install_opener(opener)
            req = urllib.request.Request('http://www.example.com')  # change the URL to test here
            sock = urllib.request.urlopen(req)  # raises URLError if the proxy is dead or unreachable
        except urllib.error.HTTPError as e:
            print('Error code: ', e.code)
            return e.code  # a non-zero HTTP error code is truthy, so it counts as a bad proxy
        except Exception as detail:
            print("ERROR:", detail)
            return True
        return False
    
    def main():
        socket.setdefaulttimeout(120)
    
        # two sample proxy IPs
        proxyList = ['125.76.226.9:80', '25.176.126.9:80']
    
        for currentProxy in proxyList:
            if is_bad_proxy(currentProxy):
                print("Bad Proxy %s" % (currentProxy))
            else:
                print("%s is working" % (currentProxy))
    
    if __name__ == '__main__':
        main() 
    

    Remember that this approach can double the time the script takes when the proxy is down, since you have to sit through two connection timeouts: one for the check and one for the actual request. Unless you specifically need to know that the proxy is at fault, simply handling the IOError is far cleaner, simpler and quicker.
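
    If the third-party requests library is available, the check can also be written with an explicit, short per-request timeout, which keeps the wait for a dead proxy bounded. A rough sketch along those lines (the proxy list and test URL are placeholders):

    import requests

    def is_bad_proxy(pip, timeout=5):
        """Return True if a request through the proxy fails or times out."""
        proxies = {'http': 'http://' + pip, 'https': 'http://' + pip}
        try:
            requests.get('http://www.example.com', proxies=proxies, timeout=timeout)
        except requests.RequestException as detail:
            print("ERROR:", detail)
            return True
        return False

    for currentProxy in ['125.76.226.9:80', '25.176.126.9:80']:
        if is_bad_proxy(currentProxy):
            print("Bad Proxy %s" % currentProxy)
        else:
            print("%s is working" % currentProxy)

    With a 5-second timeout, the dead-proxy case fails fast instead of waiting out the default socket timeout.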
