Question
My question is this: is it possible to request two different URLs at the same time?
What I'm trying to do is use a Python script to request two different URLs at the same time, making two PHP scripts run simultaneously (on different servers, each triggered by a terminal command). My issue is that I can't run them one right after the other, because each takes a specific amount of time to do its work, and they need to start at the same time and end at the same time.
Is this possible using urllib2.urlopen? If so, how would I go about doing this?
If not, then what would be a good method to do so?
Currently I have something like:
import urllib2
...
if cmd.startswith('!command '):
    cmdTime = cmd.replace('!command ', '', 1)
    urllib2.urlopen('http://example.com/page.php?time=' + cmdTime)
    urllib2.urlopen('http://example2.com/page.php?time=' + cmdTime)
    print "Finished."
My issue is that they don't run at the same time. If I run !command 60, it hits example.com for 60 seconds, then moves on to example2.com and runs that one for another 60 seconds.
Answer 1:
I would suggest creating a function that opens a single URL, then looping over the list of URLs and starting one thread per URL. Here is some sample code; modify it accordingly.
import threading
import urllib2
def execute_url(url, cmdTime):
    # Each thread opens one URL, so one blocking call no longer delays the other.
    urllib2.urlopen(url + cmdTime)

urls_list = ['url1', 'url2']
if cmd.startswith('!command '):
    cmdTime = cmd.replace('!command ', '', 1)
    processes = []
    for url in urls_list:
        process = threading.Thread(target=execute_url, args=(url, cmdTime))
        process.setDaemon(True)
        process.start()
        processes.append(process)
    # Wait for both requests to finish before continuing.
    for process in processes:
        process.join()
Answer 2:
You can use multiple threads and send one request per thread. In Python, you can subclass threading.Thread and override its run method. Start two threads and synchronize them so the two requests are sent at almost the same time.
However, this cannot guarantee that the PHP scripts on the servers execute at exactly the same time, because network latency and system scheduling are not under your control.
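A minimal Python 3 sketch of that idea: a threading.Thread subclass whose run method performs one request. To keep it self-contained, time.sleep stands in for the blocking urlopen call (the URL and duration are illustrative, not from the question); the timing shows the two "requests" overlap instead of running back to back.

```python
import threading
import time

class RequestThread(threading.Thread):
    """One (simulated) request per thread."""
    def __init__(self, url, duration):
        super().__init__()
        self.url = url
        self.duration = duration

    def run(self):
        # A real version would call urllib.request.urlopen(self.url) here;
        # time.sleep stands in for a request that takes `duration` seconds.
        time.sleep(self.duration)

start = time.time()
threads = [RequestThread('http://example.com/page.php?time=1', 1),
           RequestThread('http://example2.com/page.php?time=1', 1)]
for t in threads:
    t.start()   # both requests begin before either finishes
for t in threads:
    t.join()    # wait for both to complete
elapsed = time.time() - start
# Because the threads overlap, elapsed is close to 1 second, not 2.
```

With sequential calls the total would be roughly the sum of the two durations; with threads it is roughly the maximum, which is the behavior the question asks for.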
Source: https://stackoverflow.com/questions/27978190/multiple-requests-using-urllib2-urlopen-at-the-same-time