Pass Scrapy Spider a list of URLs to crawl via .txt file

无人及你 2020-12-24 11:16

I'm a little new to Python and very new to Scrapy.

I've set up a spider to crawl and extract all the information I need. However, I need to pass a .txt file of URLs to the spider so that it crawls each of those URLs.

4 Answers
  •  天涯浪人
    2020-12-24 11:42

    If your URLs are listed one per line,

    def get_urls(filename):
        # Read the .txt file and return a list of URLs, one per non-empty line.
        with open(filename) as f:
            return [line.strip() for line in f if line.strip()]
    

    then these lines of code will give you the URLs.
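
    To show how that list could actually be handed to the spider, here is a minimal sketch. The spider name (url_file_spider) and the filename argument are assumptions for illustration; adapt them to your own spider class. Scrapy passes -a arguments from the command line into the spider's __init__ as keyword arguments, and start_urls set there is crawled by the default start_requests.

    import scrapy

    def get_urls(filename):
        # One URL per non-empty line in the .txt file.
        with open(filename) as f:
            return [line.strip() for line in f if line.strip()]

    class UrlFileSpider(scrapy.Spider):
        name = "url_file_spider"  # hypothetical name, use your own

        def __init__(self, filename="urls.txt", *args, **kwargs):
            super().__init__(*args, **kwargs)
            # The file name can be supplied with: -a filename=urls.txt
            self.start_urls = get_urls(filename)

        def parse(self, response):
            # Replace with your own extraction logic.
            yield {"url": response.url, "title": response.css("title::text").get()}

    Run it with something like: scrapy crawl url_file_spider -a filename=urls.txt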
