several concurrent URL calls

Submitted by 隐身守侯 on 2019-12-24 19:07:21

Question


How can I make, say N url calls in parallel, and process the responses as they come back?

I want to read the responses and print them to the screen, possibly after some manipulation. I don't care about the order of the responses.


Answer 1:


You can use Twisted Python for this; see the example here: https://twistedmatrix.com/documents/13.0.0/web/howto/client.html#auto3

Twisted is an asynchronous programming library for Python which lets you carry out multiple actions "at the same time," and it comes with an HTTP client (and server).
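The same event-loop idea can be sketched with the standard library's asyncio (an analogous async framework, used here instead of Twisted so the sketch needs no third-party install). The `fetch` coroutine below is a hypothetical stub standing in for a real async HTTP request; with Twisted you would use its `Agent` client, and the `example.com` URLs are placeholders.

```python
import asyncio

async def fetch(url):
    # Hypothetical stub standing in for a real async HTTP request
    # (with Twisted you would use twisted.web.client.Agent instead).
    await asyncio.sleep(0)  # yield control to the event loop
    return f"response from {url}"

async def fetch_all(urls):
    # Schedule all fetches concurrently, then consume them in
    # completion order, not submission order.
    tasks = [asyncio.ensure_future(fetch(u)) for u in urls]
    results = []
    for finished in asyncio.as_completed(tasks):
        results.append(await finished)
    return results

urls = [f"https://example.com/page/{i}" for i in range(5)]
responses = asyncio.run(fetch_all(urls))
for body in responses:
    print(body)
```

Each response is processed as soon as its task finishes, which matches the "don't care about order" requirement in the question.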




Answer 2:


One basic solution that comes to mind is to use threading.

Depending on the number of URLs you retrieve in parallel, you could have one thread per URL. Or (which scales better) have a fixed number of "worker" threads reading URLs from a shared Queue.
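The worker-pool variant can be sketched with the standard library's `threading` and `queue` modules. The `fetch` function below is a hypothetical stub standing in for a real HTTP request (e.g. `urllib.request.urlopen`), so the sketch runs without network access, and the `example.com` URLs are placeholders.

```python
import queue
import threading

def fetch(url):
    # Hypothetical stub standing in for a real HTTP request,
    # e.g. urllib.request.urlopen(url).read().
    return f"response from {url}"

def worker(url_queue, results, lock):
    # Each worker pulls URLs from the shared queue until it is drained.
    while True:
        try:
            url = url_queue.get_nowait()
        except queue.Empty:
            return
        body = fetch(url)
        with lock:  # the results list is shared across threads
            results.append(body)

def fetch_all(urls, num_workers=4):
    url_queue = queue.Queue()
    for url in urls:
        url_queue.put(url)
    results = []
    lock = threading.Lock()
    threads = [
        threading.Thread(target=worker, args=(url_queue, results, lock))
        for _ in range(num_workers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results  # order reflects completion, not submission

urls = [f"https://example.com/page/{i}" for i in range(8)]
responses = fetch_all(urls)
for body in responses:
    print(body)
```

The fixed pool size caps concurrency regardless of how many URLs are queued, which is why this scales better than spawning one thread per URL.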



Source: https://stackoverflow.com/questions/17133921/several-concurrent-url-calls
