I have 20 spiders in one project. Each spider has a different task and URL to crawl, but the data are similar, so I'm using a shared items.py and pipelines.py.
OK, then you can use the CloseSpider exception:

```python
from scrapy.exceptions import CloseSpider

# inside a callback, when your condition is met:
raise CloseSpider("message")
```