Python, multi-threading, fetch webpages, download webpages
Question: I want to batch download webpages from one site. My 'urls.txt' file contains 5,000,000 URLs and is about 300 MB. How can I fetch these URLs with multiple threads and download the pages? Or how else can I batch download them?

My idea:

with open('urls.txt', 'r') as f:
    for el in f:
        # fetch these urls

Or should I use Twisted? Is there a good solution for this?

Answer 1: If this isn't part of a larger program, then notnoop's idea of using some existing tool to accomplish this is a pretty good one. If a shell loop
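To illustrate the multi-threaded approach the question asks about, here is a minimal sketch using the standard library's `concurrent.futures.ThreadPoolExecutor` (threads suit this I/O-bound job). The filenames, the `fetch`/`download_all` helpers, and the naming scheme for saved pages are all illustrative assumptions, not part of the original question:

```python
import os
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch(url, out_dir="pages", timeout=10):
    """Download one URL and save its body; return (url, ok, detail)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = resp.read()
    except Exception as exc:
        return url, False, str(exc)
    os.makedirs(out_dir, exist_ok=True)
    # Derive a crude filename from the URL (illustrative, not collision-safe).
    name = url.strip().replace("://", "_").replace("/", "_")[:200]
    with open(os.path.join(out_dir, name), "wb") as f:
        f.write(data)
    return url, True, len(data)

def download_all(urls, workers=20):
    """Fetch many URLs concurrently with a bounded thread pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch, urls))

if __name__ == "__main__":
    # Stream the large url list line by line instead of
    # loading all 5,000,000 entries into memory at once.
    with open("urls.txt") as f:
        urls = (line.strip() for line in f if line.strip())
        for url, ok, detail in download_all(urls):
            print("OK " if ok else "ERR", url, detail)
```

For 5M URLs you would likely also want retries, politeness delays per host, and checkpointing of completed URLs; a pool of a few dozen threads is usually enough before the bottleneck becomes the remote server or your bandwidth.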