Python urllib2.urlopen() is slow, need a better way to read several urls
Question: As the title suggests, I'm working on a site written in Python that makes several calls to the urllib2 module to read websites, which I then parse with BeautifulSoup. Since I have to read 5-10 sites, the page takes a while to load. I'm wondering if there's a way to read all the sites at once, or any tricks to make it faster — for example, should I close the urllib2.urlopen response after each read, or keep it open? Added: also, if I were to just switch over to PHP, would that be faster for fetching and parsing?
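One common approach for this is to fetch the pages in parallel threads, so the network waits overlap instead of adding up. A minimal sketch, assuming Python 3 (where urllib2's `urlopen` lives in `urllib.request`); the URL list, timeout, and worker count are placeholders you'd adjust:

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request  # urllib2's urlopen lives here on Python 3


def fetch(url):
    # Read one page; the with-block closes the response promptly,
    # which frees the underlying connection.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()


def fetch_all(urls, fetcher=fetch, max_workers=5):
    # Issue the downloads in parallel threads; total time approaches
    # that of the slowest single request instead of the sum of all.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetcher, urls))
```

`pool.map` preserves the input order, so each result lines up with its URL and can be handed straight to BeautifulSoup for parsing.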