I want to use scrapy for crawling web pages. Is there a way to pass the start URL from the terminal itself?
The documentation says that either the name of the spider or a URL can be given to the crawl command, but how do I pass the start URL as an argument?
An even easier way to pass multiple URLs than what Peter suggested is to give them as a single string, with the URLs separated by commas, like this:
-a start_urls="http://example1.com,http://example2.com"
In the spider you would then simply split the string on ',' to get a list of URLs:
self.start_urls = kwargs.get('start_urls').split(',')
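Putting it together, a minimal sketch of the pattern (the class and spider name are just examples; in a real project the class would subclass `scrapy.Spider` and forward `*args, **kwargs` to `super().__init__`, and you would run it with `scrapy crawl myspider -a start_urls="..."`):

```python
# Sketch of a spider that accepts a comma-separated start_urls argument.
# Shown without the scrapy dependency so the splitting logic is clear;
# subclass scrapy.Spider and call super().__init__ in real code.
class MySpider:
    name = "myspider"

    def __init__(self, *args, **kwargs):
        # -a start_urls="..." arrives as a keyword argument; split it
        # on commas to get the list scrapy expects in self.start_urls.
        self.start_urls = kwargs.get('start_urls', '').split(',')

spider = MySpider(start_urls="http://example1.com,http://example2.com")
print(spider.start_urls)
# → ['http://example1.com', 'http://example2.com']
```

Note the `.get('start_urls', '')` default: without it, forgetting the `-a` flag would make `kwargs.get('start_urls')` return `None` and the `.split(',')` call would raise an `AttributeError`.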