How does the Requests library compare with PyCurl performance-wise?
My understanding is that Requests is a Python wrapper for urllib, whereas PyCurl is a Python wrapper around libcurl, which is written in C and should therefore be faster.
First and foremost: requests is built on top of the urllib3 library; the stdlib urllib and urllib2 modules are not used at all.
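If you want to see that layering for yourself, here's a quick sketch (assuming a recent requests version; `example.com` is just a placeholder URL) that inspects the transport adapter a Session uses, whose connection pool comes straight from urllib3:

```python
import requests

# Every requests Session routes traffic through an HTTPAdapter,
# which delegates connection pooling to urllib3.
session = requests.Session()
adapter = session.get_adapter("https://example.com")

# Prints something like <class 'urllib3.poolmanager.PoolManager'>,
# i.e. urllib3 is doing the actual work underneath requests.
print(type(adapter.poolmanager))
```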
There is little point in comparing requests with pycurl on performance. pycurl may use C code for its work, but like all network programming, your execution speed depends largely on the network that separates your machine from the target server. Moreover, the target server could be slow to respond.
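To make that concrete, here's a rough timing sketch, not a rigorous benchmark (it assumes the public httpbin.org delay endpoint is reachable): the server deliberately waits about a second, and that wait dwarfs any per-request library overhead:

```python
import time

import requests

# httpbin's /delay/1 endpoint holds the response for ~1 second.
URL = "https://httpbin.org/delay/1"

start = time.perf_counter()
requests.get(URL, timeout=30)
elapsed = time.perf_counter() - start

# Nearly all of the elapsed time is network round trips plus the
# server's deliberate delay, not Python-vs-C library overhead.
print(f"total: {elapsed:.3f}s")
```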
In the end, requests has a far friendlier API to work with, and you'll be more productive using it.
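For a sense of what that friendlier API looks like, here's a minimal side-by-side sketch of the same GET request in both libraries (the URL is a placeholder, and pycurl must be installed separately):

```python
import io

import pycurl
import requests

URL = "https://httpbin.org/get"

# requests: one call returns a response object with the status
# code and a decoded body ready to use.
resp = requests.get(URL, timeout=10)
print(resp.status_code, resp.text[:60])

# pycurl: you manage the output buffer and set options by hand,
# mirroring the underlying libcurl C API.
buf = io.BytesIO()
curl = pycurl.Curl()
curl.setopt(pycurl.URL, URL)
curl.setopt(pycurl.WRITEDATA, buf)
curl.setopt(pycurl.TIMEOUT, 10)
curl.perform()
print(curl.getinfo(pycurl.RESPONSE_CODE), buf.getvalue()[:60])
curl.close()
```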