Scrapy LinkExtractor - Limit the number of pages crawled per URL
Question: I am trying to limit the number of pages crawled per start URL in a Scrapy CrawlSpider. I have a list of start_urls and I want to set a limit on the number of pages crawled from each of them. Once the limit is reached, the spider should move on to the next start_url. I know about the DEPTH_LIMIT setting, but that is not what I am looking for. Any help would be appreciated. Here is the code I currently have:

from scrapy.spiders import CrawlSpider

class MySpider(CrawlSpider):
    name = 'test'
    allowed_domains = domainvarwebsite
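One possible approach, sketched below under stated assumptions: instead of relying on the Rule-based CrawlSpider, a plain scrapy.Spider can tag every request with the start URL it descends from (via meta) and keep a per-start-URL page counter, refusing to follow further links once the limit is reached. The spider name, domain, start URLs, and MAX_PAGES_PER_START_URL below are placeholders, not values from the question.

import scrapy
from scrapy.linkextractors import LinkExtractor

MAX_PAGES_PER_START_URL = 20  # assumed limit, adjust as needed


class LimitedSpider(scrapy.Spider):
    name = 'limited'
    allowed_domains = ['example.com']  # placeholder domain
    start_urls = ['https://example.com/a', 'https://example.com/b']  # placeholders

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.link_extractor = LinkExtractor()
        self.pages_crawled = {}  # pages seen so far, keyed by start URL

    def start_requests(self):
        for url in self.start_urls:
            # remember which start URL each request descends from
            yield scrapy.Request(url, callback=self.parse, meta={'origin': url})

    def parse(self, response):
        origin = response.meta['origin']
        count = self.pages_crawled.get(origin, 0) + 1
        self.pages_crawled[origin] = count

        # ... extract and yield items from `response` here ...

        # stop following links for this start URL once the limit is hit
        if count >= MAX_PAGES_PER_START_URL:
            return
        for link in self.link_extractor.extract_links(response):
            yield scrapy.Request(link.url, callback=self.parse,
                                 meta={'origin': origin})

Note that requests already scheduled when the limit is hit will still be downloaded, so the count per start URL can slightly overshoot the limit; a stricter cap would also have to filter or drop those in-flight requests.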