I have 20 spiders in one project. Each spider has a different task and URL to crawl, but the data are similar, so I'm using a shared items.py and pipelines.py.
If you want to stop a spider from a pipeline, you can call the engine's close_spider() method:
class MongoDBPipeline(object):
    def process_item(self, item, spider):
        # Pass the spider (not the pipeline) as the first argument.
        spider.crawler.engine.close_spider(spider, reason='finished')
        return item
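Note that close_spider() requests a graceful shutdown, so requests already in flight may still finish. A minimal runnable sketch of the same pattern, stopping the spider after a fixed number of items: the ITEM_LIMIT value, the reason string, and the Fake* stub classes are assumptions for illustration only; in a real project Scrapy supplies the spider/crawler/engine objects.

```python
# Sketch: stop a spider from a pipeline once an item limit is reached.
# ITEM_LIMIT and the Fake* stubs are illustrative; Scrapy provides the
# real spider/crawler/engine objects at runtime.

ITEM_LIMIT = 3  # assumed threshold for this example

class MongoDBPipeline(object):
    def __init__(self):
        self.count = 0

    def process_item(self, item, spider):
        self.count += 1
        if self.count >= ITEM_LIMIT:
            # First argument is the spider itself, not the pipeline.
            spider.crawler.engine.close_spider(spider, reason='item limit reached')
        return item  # always return the item so later pipelines still run

# --- stand-in stubs so this sketch runs outside Scrapy ---
class FakeEngine:
    def __init__(self):
        self.closed_reason = None
    def close_spider(self, spider, reason=''):
        # Record the reason instead of actually shutting anything down.
        self.closed_reason = reason

class FakeCrawler:
    def __init__(self):
        self.engine = FakeEngine()

class FakeSpider:
    def __init__(self):
        self.crawler = FakeCrawler()

spider = FakeSpider()
pipeline = MongoDBPipeline()
for i in range(5):
    pipeline.process_item({'n': i}, spider)

print(spider.crawler.engine.closed_reason)  # item limit reached
```

Because every spider shares this pipelines.py, the counter is per-pipeline-instance; if you need per-spider limits, key the count by spider.name instead.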