Question
I need to automate a download process from a site which requires the following:
- send an HTTP POST request containing your username and password
- I should get a cookie (probably containing a session ID)
- send an HTTP GET request for the file, sending my cookie details in the HTTP headers
Using wget, I currently first log in with a password (opening a session?):
wget --no-check-certificate -O /dev/null --save-cookies auth.rda_ucar_edu --post-data='email=name@domain.edu&passwd=5555&action=login' https://rda.ucar.edu/cgi-bin/login
Then I retrieve the files I need:
wget --no-check-certificate -N --load-cookies auth.rda_ucar_edu http://rda.ucar.edu/data/ds608.0/3HRLY/1979/NARRflx_197901_0916.tar
Is there a nice way to do this in Python? I have tried many approaches and have not gotten this to work. The following Python code seems to log me in correctly. However, I believe I need to keep the session alive while I download my data?
# Python 2 (urllib/urllib2)
import urllib
import urllib2

url = 'https://rda.ucar.edu/cgi-bin/login'
values = {'email': 'name@domain.edu', 'password': '5555', 'action': 'login'}
data = urllib.urlencode(values)
binary_data = data.encode('ascii')
req = urllib2.Request(url, binary_data)
response = urllib2.urlopen(req)
print response.read()
I have also tried this:
from requests import session

with session() as c:
    c.post(url, values)
    request = c.get('http://rda.ucar.edu/data/ds608.0/3HRLY/1979/NARRflx_197901_0108.tar')
Any suggestions will be helpful.
Answer 1:
You need to save the cookies set by the login response and send them with the download request. A requests.Session does this for you automatically, or you can use a third-party library such as mechanize or Scrapy instead.
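Since the question already uses requests, here is a minimal sketch with requests.Session. The URLs and form fields are copied from the question and are untested against the actual server; note the wget command sends the field as passwd, while the urllib attempt used password, which may be why it silently failed.

```python
import requests

def download_with_login(login_url, credentials, file_url, dest_path):
    """Log in, then download a file, reusing the session's cookies."""
    with requests.Session() as s:
        # The Session stores any cookies set by the login response
        # and sends them automatically on every later request.
        resp = s.post(login_url, data=credentials)
        resp.raise_for_status()

        # Stream the download so the whole tar file never sits in memory.
        with s.get(file_url, stream=True) as r:
            r.raise_for_status()
            with open(dest_path, 'wb') as f:
                for chunk in r.iter_content(chunk_size=8192):
                    f.write(chunk)

if __name__ == '__main__':
    # Values copied from the question; 'passwd' matches the wget
    # --post-data string rather than the urllib attempt's 'password'.
    download_with_login(
        'https://rda.ucar.edu/cgi-bin/login',
        {'email': 'name@domain.edu', 'passwd': '5555', 'action': 'login'},
        'http://rda.ucar.edu/data/ds608.0/3HRLY/1979/NARRflx_197901_0108.tar',
        'NARRflx_197901_0108.tar',
    )
```

Because the Session object carries the cookie jar across both requests, there is no separate step to "keep the session live": the second request reuses whatever session cookie the login set.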
Source: https://stackoverflow.com/questions/16091745/python-automating-a-wget-script-with-login-required