I've got a Python web crawler and I want to distribute the download requests among many different proxy servers, probably running Squid (though I'm open to alternatives). For
Edit: There is even a Python wrapper for gimmeproxy: https://github.com/ericfourrier/gimmeproxy-api
If you don't mind Node, you can use proxy-lists to collect public proxies and check-proxy to verify them. That's exactly how https://gimmeproxy.com works; more info here.
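If you'd rather stay in Python and skip the wrapper, a minimal sketch that pulls a proxy from the gimmeproxy JSON API and routes a request through it could look like this (the endpoint and response field names are my assumption from memory, so double-check them against the gimmeproxy docs):

    import requests

    def get_proxy():
        # Ask gimmeproxy.com for a working public proxy.
        # The "curl" / "protocol" / "ip" / "port" fields are assumed here, verify with the API docs.
        data = requests.get("https://gimmeproxy.com/api/getProxy", timeout=10).json()
        return data.get("curl") or "{protocol}://{ip}:{port}".format(**data)

    proxy = get_proxy()
    # Route a crawl request through the fetched proxy.
    resp = requests.get("http://example.com", proxies={"http": proxy, "https": proxy}, timeout=10)
    print(resp.status_code)

For a crawler you'd presumably fetch a batch of proxies up front and rotate through them rather than hitting the API once per request.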