I've been searching the Scrapy documentation for a way to limit the number of requests my spiders are allowed to make. During development I don't want to sit here and wait for a full crawl to finish.
You are looking for the CLOSESPIDER_PAGECOUNT setting of the CloseSpider extension:
An integer which specifies the maximum number of responses to crawl. If the spider crawls more than that, the spider will be closed with the reason closespider_pagecount. If zero (or not set), spiders won't be closed by number of crawled responses.
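Here is a minimal sketch of how you might use it during development; the spider name, start URL, and page limit below are placeholders, so adjust them for your project:

```python
import scrapy


class LimitedSpider(scrapy.Spider):
    # Hypothetical spider for illustration purposes
    name = "limited"
    start_urls = ["https://example.com"]

    custom_settings = {
        # Ask the CloseSpider extension to stop the crawl after
        # roughly 10 responses have been downloaded
        "CLOSESPIDER_PAGECOUNT": 10,
    }

    def parse(self, response):
        # Follow every link on the page; the extension closes the
        # spider once the page count limit is reached
        for href in response.css("a::attr(href)").getall():
            yield response.follow(href, callback=self.parse)
```

You can also override it per run without touching any code, e.g. `scrapy crawl limited -s CLOSESPIDER_PAGECOUNT=10`, which is handy when you only want the limit during development.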