Question
I need to send the crawler stats to a URL that is passed as a spider argument, making a POST request at regular 5-minute intervals. How can I do that?
Answer 1:
You will probably want to write an extension that simply makes a POST request every 5 minutes.
You can make these requests either with Scrapy's own mechanisms (e.g. engine.download()) or with a separate async HTTP client (e.g. treq).
If you're not sure how to structure your extension, take a look at logstats.py, which does a similar thing, except it logs the stats instead of sending them over HTTP.
Since you're writing an extension anyway, I'd recommend making the URL and interval configurable via settings, but that choice is up to you. A minimal sketch is shown below.
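Here is a minimal sketch of such an extension, modeled loosely on logstats.py and using treq for the POST. It assumes the target URL arrives as a spider argument named stats_url; the class name StatsPoster and the STATSPOST_INTERVAL setting are illustrative, not Scrapy built-ins.

```python
import json
import logging

import treq  # assumed installed; async HTTP client on the Twisted reactor
from twisted.internet import task
from scrapy import signals
from scrapy.exceptions import NotConfigured

logger = logging.getLogger(__name__)


class StatsPoster:
    """Periodically POST the crawler stats to a URL given as a spider argument."""

    def __init__(self, stats, interval=300.0):
        self.stats = stats
        self.interval = interval
        self.task = None
        self.url = None

    @classmethod
    def from_crawler(cls, crawler):
        # Illustrative setting name; defaults to 5 minutes.
        interval = crawler.settings.getfloat("STATSPOST_INTERVAL", 300.0)
        ext = cls(crawler.stats, interval)
        crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        return ext

    def spider_opened(self, spider):
        # Read the target URL from the spider argument (-a stats_url=...).
        self.url = getattr(spider, "stats_url", None)
        if not self.url:
            raise NotConfigured("stats_url spider argument is missing")
        self.task = task.LoopingCall(self.post_stats, spider)
        self.task.start(self.interval, now=False)

    def spider_closed(self, spider, reason):
        if self.task and self.task.running:
            self.task.stop()

    def post_stats(self, spider):
        # Serialize the current stats snapshot; default=str handles datetimes.
        body = json.dumps(self.stats.get_stats(), default=str).encode("utf-8")
        d = treq.post(self.url, data=body,
                      headers={"Content-Type": "application/json"})
        d.addErrback(lambda f: logger.warning("Stats POST failed: %s", f.value))
        return d
```

To try it out, enable the extension in settings (e.g. EXTENSIONS = {"myproject.extensions.StatsPoster": 500}) and run the spider with -a stats_url=http://example.com/stats. Swapping treq for engine.download() would work too; treq just keeps the sketch shorter.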
Source: https://stackoverflow.com/questions/54707698/scrapy-send-stats-to-a-url-passed-as-argument-as-a-post-request-every-5-minutes