How can I make scrapy crawl break and exit when encountering the first exception?

伪装坚强ぢ 2020-12-14 02:20

For development purposes, I would like to stop all scrapy crawling activity as soon as the first exception (in a spider or a pipeline) occurs.

Any advice?

3 Answers
  •  一整个雨季
    2020-12-14 02:49

    It depends entirely on your business logic, but this will work for you:

    # from inside a spider callback, reach the engine via self.crawler
    self.crawler.engine.close_spider(self, 'log message')
    

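    Alternatively, Scrapy's built-in CloseSpider extension can stop the crawl automatically once a number of errors have occurred, which matches the "break on first exception" use case directly. A minimal settings fragment (the setting name comes from the stock extension):

    ```python
    # settings.py — CloseSpider extension:
    # close the spider as soon as 1 error has been raised
    CLOSESPIDER_ERRORCOUNT = 1
    ```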

    And the worst solution is to kill the whole process outright:

    import sys
    
    sys.exit("SHUT DOWN EVERYTHING!")
    
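    A cleaner middle ground is to raise Scrapy's `CloseSpider` exception from a callback, which shuts the spider down gracefully instead of killing the process. A sketch, assuming a typical spider (the spider name and URL are placeholders):

    ```python
    import scrapy
    from scrapy.exceptions import CloseSpider

    class MySpider(scrapy.Spider):  # hypothetical example spider
        name = "my_spider"
        start_urls = ["https://example.com"]  # placeholder URL

        def parse(self, response):
            try:
                ...  # your parsing logic here
            except Exception as exc:
                # re-raise as CloseSpider so the crawl stops cleanly
                raise CloseSpider(f"stopping on first exception: {exc}")
    ```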
