Passing arguments to process.crawl in Scrapy python

囚心锁ツ 2020-12-14 06:33

I would like to get the same result as this command line:

    scrapy crawl linkedin_anonymous -a first=James -a last=Bond -o output.json

My script is as follows:

3 Answers
  • 2020-12-14 07:05

    If you are running Scrapyd and want to schedule the spider, pass the arguments as extra -d fields:

    curl http://localhost:6800/schedule.json -d project=projectname -d spider=spidername -d first='James' -d last='Bond'

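The curl call above POSTs form fields to Scrapyd's schedule.json endpoint; fields other than the recognized scheduling options are forwarded to the spider as arguments. A minimal Python sketch of the same request, assuming a Scrapyd server on localhost:6800 (the project and spider names here are placeholders, as in the curl example):

```python
# Equivalent payload to the curl command above. The project/spider names
# are hypothetical placeholders; extra fields become spider arguments.
payload = {
    "project": "projectname",
    "spider": "spidername",
    "first": "James",
    "last": "Bond",
}

# Sending it requires the third-party `requests` package and a running Scrapyd:
# import requests
# response = requests.post("http://localhost:6800/schedule.json", data=payload)
# print(response.json())
```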
  • 2020-12-14 07:06

    Pass the spider arguments in the process.crawl call; any extra keyword arguments are forwarded to the spider:

    process.crawl(spider, input='inputargument', first='James', last='Bond')
    
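Why this works: scrapy.Spider.__init__ copies any keyword arguments it does not recognize straight onto the spider instance, so first and last become self.first and self.last inside the spider. A stand-in class (not Scrapy itself) sketching that mechanism:

```python
class MiniSpider:
    """Stand-in mimicking how scrapy.Spider.__init__ handles -a arguments."""

    def __init__(self, name=None, **kwargs):
        if name is not None:
            self.name = name
        # Scrapy copies leftover keyword arguments onto the instance like this:
        self.__dict__.update(kwargs)


spider = MiniSpider(name="linkedin_anonymous", first="James", last="Bond")
print(spider.first, spider.last)  # James Bond
```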
  • 2020-12-14 07:16

    You can do it the easy way:

    from scrapy import cmdline
    
    cmdline.execute("scrapy crawl linkedin_anonymous -a first=James -a last=Bond -o output.json".split())
    
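One caveat with the .split() trick: plain str.split breaks on every space, so it mangles any argument that itself contains a space. The standard library's shlex.split tokenizes like a shell and is the safer choice (the quoted "Bond 007" value below is a made-up example to show the difference):

```python
import shlex

cmd = 'scrapy crawl linkedin_anonymous -a first=James -a "last=Bond 007" -o output.json'

naive = cmd.split()      # breaks inside the quoted argument
argv = shlex.split(cmd)  # respects the quoting, like a shell would

print(argv)
```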