How to pass custom settings through CrawlerProcess in Scrapy?
I have two CrawlerProcesses, each calling a different spider. I want to pass custom settings to one of these processes so that the spider's output is saved to CSV. I thought I could do this:

```python
storage_settings = {'FEED_FORMAT': 'csv', 'FEED_URI': 'foo.csv'}
process = CrawlerProcess(get_project_settings())
process.crawl('ABC', crawl_links=main_links, custom_settings=storage_settings)
process.start()
```

and in my spider I read them as an argument:

```python
def __init__(self, crawl_links=None, allowed_domains=None, custom_settings=None, *args, **kwargs):
    self.start_urls = crawl_links
    self.allowed_domains = allowed_domains
```