urllib2

How to send utf-8 content in a urllib2 request?

倖福魔咒の submitted on 2019-12-04 20:11:25
I've been struggling with this for the past half a day, and although I've found some info about similar problems, nothing really hits the spot. I'm trying to send a PUT request using urllib2 with data that contains some Unicode characters:

    body = u'{ "bbb" : "asdf\xd7\xa9\xd7\x93\xd7\x92"}'
    conn = urllib2.Request(request_url, body, headers)
    conn.get_method = lambda: 'PUT'
    response = urllib2.urlopen(conn)

I've tried body = body.encode('utf-8') and other variations, but whatever I do I get the following error:

    UnicodeEncodeError at ... 'ascii' codec can't decode byte 0xc3 in
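A minimal sketch of one common fix, assuming request_url is already defined: build the JSON from real unicode text and encode the whole body to UTF-8 bytes before constructing the Request, so urllib2 never has to coerce a unicode string with the default ASCII codec.

    import json
    import urllib2

    payload = {u'bbb': u'asdf\u05e9\u05d3\u05d2'}   # real unicode text, not raw UTF-8 bytes
    body = json.dumps(payload, ensure_ascii=False).encode('utf-8')   # plain byte string
    headers = {'Content-Type': 'application/json; charset=utf-8'}

    req = urllib2.Request(request_url, body, headers)   # request_url assumed defined
    req.get_method = lambda: 'PUT'
    response = urllib2.urlopen(req)

The key point is that everything handed to urllib2 is already a byte string, so nothing downstream needs to guess an encoding.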

How to use Python urllib2 to create a service POST with the Bitbucket API?

試著忘記壹切 submitted on 2019-12-04 18:30:53
While this code works fine to add a deployment ssh-key to my repos...

    print 'Connecting to Bitbucket...'
    bitbucket_access = base64.b64encode(userbb + ":" + passwordbb)
    bitbucket_headers = {"Content-Type": "application/json",
                         "Authorization": "Basic " + bitbucket_access}
    bitbucket_request_url = "https://bitbucket.org/api/1.0/repositories/<username>/%s/deploy-keys" % project_name
    bitbucket_request_req = urllib2.Request(bitbucket_request_url)
    for key, value in bitbucket_headers.items():
        bitbucket_request_req.add_header(key, value)
    request_data = json.dumps({"key": public_key, "label": subproject})
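A sketch of one way to finish the POST, reusing bitbucket_headers and request_data from the snippet above; passing a data argument is what turns the urllib2 request into a POST. The .../services path is only an assumption modeled on the deploy-keys URL.

    import urllib2

    # Hypothetical endpoint, following the same 1.0 URL pattern as deploy-keys above.
    services_url = "https://bitbucket.org/api/1.0/repositories/<username>/%s/services" % project_name

    req = urllib2.Request(services_url, request_data)   # data present, so this is a POST
    for key, value in bitbucket_headers.items():
        req.add_header(key, value)

    response = urllib2.urlopen(req)
    print response.read()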

How to reliably process web-data in Python

拜拜、爱过 submitted on 2019-12-04 18:25:22
I'm using the following code to get data from a website:

    time_out = 4

    def tryconnect(turl, timer=time_out, retries=10):
        urlopener = None
        sitefound = 1
        tried = 0
        while (sitefound != 0) and tried < retries:
            try:
                urlopener = urllib2.urlopen(turl, None, timer)
                sitefound = 0
            except urllib2.URLError:
                tried += 1
        if urlopener:
            return urlopener
        else:
            return None

    [...]

    urlopener = tryconnect('www.example.com')
    if not urlopener:
        return None
    try:
        for line in urlopener:
            do stuff
    except httplib.IncompleteRead:
        print 'incomplete'
        return None
    except socket.timeout:
        print 'socket'
        return None
    return stuff

Is
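One way to fold the retry loop and the two read-time exceptions into a single helper; a sketch that assumes reading the whole body at once is acceptable rather than streaming it line by line.

    import socket
    import httplib
    import urllib2

    def fetch_lines(url, timeout=4, retries=10):
        """Fetch url and return its lines, retrying on common transient failures."""
        for _ in range(retries):
            try:
                response = urllib2.urlopen(url, None, timeout)
                return response.read().splitlines()
            except (urllib2.URLError, httplib.IncompleteRead, socket.timeout):
                continue   # try again until retries are exhausted
        return None

    lines = fetch_lines('http://www.example.com')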

cURL: https through a proxy

南笙酒味 submitted on 2019-12-04 17:59:16
I need to make a cURL request to an HTTPS URL, but I have to go through a proxy as well. Is there some problem with doing this? I have been having so much trouble doing this with curl and PHP that I tried doing it with urllib2 in Python, only to find that urllib2 cannot POST to https when going through a proxy. I haven't been able to find any documentation to this effect for cURL, but I was wondering if anyone knew whether this is an issue?

I find testing with command-line curl a big help before moving to PHP/cURL. For example, w/ command-line, unless you've configured certificates, you'll need
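For comparison, a pycurl sketch of the same kind of request: a POST through a proxy to an HTTPS URL. The host, port, URL and form fields below are placeholders, and disabling certificate verification mirrors command-line curl's -k, which is only appropriate for testing.

    import StringIO
    import pycurl

    target_url = 'https://example.com/endpoint'      # placeholder URL
    proxy = 'http://proxy.example.com:8080'          # placeholder proxy
    post_body = 'field1=value1&field2=value2'        # placeholder form data

    buf = StringIO.StringIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, target_url)
    c.setopt(pycurl.PROXY, proxy)
    c.setopt(pycurl.POSTFIELDS, post_body)           # makes this request a POST
    c.setopt(pycurl.WRITEFUNCTION, buf.write)
    c.setopt(pycurl.SSL_VERIFYPEER, 0)               # like curl -k: skip cert checks, testing only
    c.setopt(pycurl.SSL_VERIFYHOST, 0)
    c.perform()
    print c.getinfo(pycurl.RESPONSE_CODE), buf.getvalue()[:200]
    c.close()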

EVENTVALIDATION error while scraping asp.net page

你离开我真会死。 submitted on 2019-12-04 17:18:29
I need to get some values from this website. Basically I need to get the Area for every city, and I am using Python and BeautifulSoup for this. What I am doing is: first I make a GET request to the page and grab __VIEWSTATE and __EVENTVALIDATION so I can make a POST request that returns the cities for a particular state. Up to here it works, and I am getting the cities for every state. To get the Area I need to make another POST with a new __VIEWSTATE and __EVENTVALIDATION, and this time I need to send the city as well, along with the other parameters. But here I am getting an error:

    505|error|500|Invalid postback or callback argument.
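The usual cause of "Invalid postback or callback argument" is reusing stale tokens: __VIEWSTATE and __EVENTVALIDATION have to be re-parsed from the response of the previous request (the one that returned the cities) before the next POST. A sketch of that step, assuming page_url, state_value and city_value exist; the ddlState/ddlCity field names are hypothetical and should be copied from the browser's network tab.

    import urllib
    import urllib2
    from BeautifulSoup import BeautifulSoup

    def hidden_fields(html_text):
        """Pull the current ASP.NET hidden fields out of a page we just fetched."""
        soup = BeautifulSoup(html_text)
        return {
            '__VIEWSTATE': soup.find('input', {'id': '__VIEWSTATE'})['value'],
            '__EVENTVALIDATION': soup.find('input', {'id': '__EVENTVALIDATION'})['value'],
        }

    # First POST: state -> cities (hypothetical field names).
    first_post = dict(hidden_fields(urllib2.urlopen(page_url).read()), ddlState=state_value)
    cities_html = urllib2.urlopen(urllib2.Request(page_url, urllib.urlencode(first_post))).read()

    # Second POST: reuse the *fresh* tokens from the cities response, plus the city.
    second_post = dict(hidden_fields(cities_html), ddlState=state_value, ddlCity=city_value)
    area_html = urllib2.urlopen(urllib2.Request(page_url, urllib.urlencode(second_post))).read()

If the server answers with an AJAX partial-postback payload (the 505|error|... format suggests it does), the fresh tokens arrive inside that pipe-delimited response rather than as input tags and have to be extracted from there instead.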

ImportError: No module named 'urllib2' Python 3 [duplicate]

房东的猫 submitted on 2019-12-04 16:54:02
Question: This question already has answers here: Import error: No module name urllib2 (8 answers). Closed 3 years ago.

The code below works fine on Python 2, but on Python 3 I get the error "ImportError: No module named 'urllib2'":

    import urllib2

    peticion = "I'm XML"
    url_test = "I'm URL"
    req = urllib2.Request(url=url_test, data=peticion, headers={'Content-Type': 'application/xml'})
    respuesta = urllib2.urlopen(req)
    print(respuesta)
    print(respuesta.read())
    respuesta.open()

Please suggest me the
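In Python 3 the module was split into urllib.request and urllib.error, and the data argument must be bytes. A rough equivalent of the snippet above, keeping the same placeholder strings:

    import urllib.request

    peticion = "I'm XML"   # placeholder payload, as in the question
    url_test = "I'm URL"   # placeholder URL
    req = urllib.request.Request(url=url_test, data=peticion.encode('utf-8'),
                                 headers={'Content-Type': 'application/xml'})
    respuesta = urllib.request.urlopen(req)
    print(respuesta)
    print(respuesta.read())
    respuesta.close()

Code that has to run on both versions can try the Python 3 import first and fall back to urllib2 inside an except ImportError block.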

How to send Multipart/related requests in Python to SOAP server?

风流意气都作罢 submitted on 2019-12-04 16:02:39
I have to send a file to a SOAP server via a multipart/related HTTP POST. I have built the message from scratch like this:

    from email.mime.application import MIMEApplication
    from email.encoders import encode_7or8bit
    from email.mime.multipart import MIMEMultipart
    from email.mime.base import MIMEBase

    envelope = """<?xml version="1.0" encoding="UTF-8"?>
    <SOAP-ENV:Envelope xmlns:SOAP-ENV="http://www.w3.org/2003/05/soap-envelope"
        xmlns:SOAP-ENC="http://www.w3.org/2003/05/soap-encoding"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:xsd="http://www.w3.org/2001/XMLSchema"
        xmlns:xop="http
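A sketch of one way to finish building and sending the message, assuming envelope, file_bytes and service_url exist: attach the envelope and the file to a MIMEMultipart('related') container, then split the generated MIME headers from the body, because HTTP carries the Content-Type (with its boundary) as a header of its own. Many services also require type= and start= parameters on that Content-Type; check the service's documentation.

    import urllib2
    from email.encoders import encode_7or8bit
    from email.mime.application import MIMEApplication
    from email.mime.multipart import MIMEMultipart

    related = MIMEMultipart('related')

    soap_part = MIMEApplication(envelope, 'soap+xml', encode_7or8bit)   # root part
    soap_part.add_header('Content-ID', '<root>')
    related.attach(soap_part)

    file_part = MIMEApplication(file_bytes)          # attachment, base64-encoded by default
    file_part.add_header('Content-ID', '<attachment>')
    related.attach(file_part)

    # as_string() generates the boundary; everything before the first blank line is
    # MIME headers, the rest is the body we actually POST.
    flat = related.as_string()
    body = flat.split('\n\n', 1)[1]
    content_type = related['Content-Type']           # multipart/related; boundary="..."

    req = urllib2.Request(service_url, body, {'Content-Type': content_type})
    response = urllib2.urlopen(req)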

Multithreading for faster downloading

青春壹個敷衍的年華 submitted on 2019-12-04 15:57:45
Question: How can I download multiple links simultaneously? My script below works, but it only downloads one file at a time and is extremely slow. I can't figure out how to incorporate multithreading into my script. The Python script:

    from BeautifulSoup import BeautifulSoup
    import lxml.html as html
    import urlparse
    import os, sys
    import urllib2
    import re

    print ("downloading and parsing Bibles...")
    root = html.parse(open('links.html'))
    for link in root.findall('//a'):
        url = link.get('href')
        name = urlparse
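A sketch of one way to parallelise the downloads with a thread pool (multiprocessing.dummy exposes the Pool API backed by threads, which suits I/O-bound work). The script above is cut off before the urlparse-based naming, so the file-name logic here is a stand-in, and the hrefs are assumed to be absolute URLs.

    import urllib2
    import lxml.html as html
    from multiprocessing.dummy import Pool   # thread pool, not processes

    def download(url):
        """Fetch one URL and save it under a name taken from the last path segment."""
        try:
            data = urllib2.urlopen(url).read()
            filename = url.rstrip('/').split('/')[-1] or 'index.html'
            with open(filename, 'wb') as f:
                f.write(data)
            return url, True
        except urllib2.URLError:
            return url, False

    root = html.parse(open('links.html'))
    urls = [link.get('href') for link in root.findall('//a') if link.get('href')]

    pool = Pool(8)                 # 8 worker threads; tune to the connection
    results = pool.map(download, urls)
    pool.close()
    pool.join()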

PIL / urllib2 - cannot identify image file when passing file using StringIO

扶醉桌前 submitted on 2019-12-04 15:02:54
I'm downloading an image from the web using urllib2. Once I have downloaded it I want to do some stuff with it using an image module called PIL. I don't want to save the file to disk and then reopen it, but rather pass it from memory using StringIO:

    from PIL import Image
    image_buff = urllib2.urlopen(url)
    image = Image.open(StringIO.StringIO(image_buff))

However, when I do this I get the following error:

    IOError: cannot identify image file <StringIO.StringIO instance at 0x101afa2d8

I think this is because I'm not passing a string but rather a urllib2 object/instance. Would anyone know how I can pass a
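Image.open() needs a file-like object it can read and seek over, and urlopen() returns a response object, not the image bytes. A minimal sketch of the usual fix, assuming url is defined: read the response first, then wrap the bytes in StringIO.

    import StringIO
    import urllib2
    from PIL import Image

    image_bytes = urllib2.urlopen(url).read()            # url assumed defined
    image = Image.open(StringIO.StringIO(image_bytes))   # PIL can seek over the buffer
    print image.size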

Logging into quora using python

别来无恙 submitted on 2019-12-04 14:59:52
I tried logging into Quora using Python, but it gives me the following error:

    urllib2.HTTPError: HTTP Error 500: Internal Server Error

This is my code so far. I also work behind a proxy.

    import urllib2
    import urllib
    import re
    import cookielib

    class Quora:
        def __init__(self):
            '''Initialising and authentication'''
            auth = 'http://name:password@proxy:port'
            cj = cookielib.CookieJar()
            logindata = urllib.urlencode({'email': 'email', 'password': 'password'})
            handler = urllib2.ProxyHandler({'http': auth})
            opener = urllib2.build_opener(handler, urllib2.HTTPCookieProcessor(cj))
            urllib2.install
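A sketch of how the truncated setup usually continues: install the opener so every urllib2 call goes through the proxy and keeps cookies, then POST the form with a browser-like User-Agent. The login URL and form fields here are assumptions; sites such as Quora typically also expect hidden/CSRF form fields, and missing them is a common reason for a 500 response.

    import urllib
    import urllib2
    import cookielib

    auth = 'http://name:password@proxy:port'                 # proxy credentials, as above
    cj = cookielib.CookieJar()
    handler = urllib2.ProxyHandler({'http': auth, 'https': auth})
    opener = urllib2.build_opener(handler, urllib2.HTTPCookieProcessor(cj))
    urllib2.install_opener(opener)

    logindata = urllib.urlencode({'email': 'email', 'password': 'password'})
    req = urllib2.Request('https://www.quora.com/login',     # hypothetical login URL
                          logindata, {'User-Agent': 'Mozilla/5.0'})
    try:
        response = urllib2.urlopen(req)
        print response.getcode()
    except urllib2.HTTPError, e:
        print e.code, e.read()[:200]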