Capture scrapy spider running status using an already defined decorator
**Question**

So I have a custom decorator called `task` that captures the status of a function, e.g.:

```python
@task(task_name='tutorial', alert_name='tutorial')
def start():
    raw_data = download_data()
    data = parse(raw_data)
    push_to_db(data)

if __name__ == "__main__":
    start()
```

Here the `task` decorator monitors the status of the `start` function: if it fails, it sends an error message to a central monitoring system using `alert_name`; otherwise it sends a success message. Now I want to add this decorator to a Scrapy spider.
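The question does not show the decorator's implementation, so here is a minimal sketch of what such a `task` decorator might look like. `send_alert` is a hypothetical placeholder for whatever client actually talks to the central monitoring system:

```python
import functools

def send_alert(alert_name, message):
    # Hypothetical transport to the central monitor; replace with the
    # real monitoring client used in your project.
    print(f"[{alert_name}] {message}")

def task(task_name, alert_name):
    """Decorator factory: report success or failure of the wrapped function."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                result = func(*args, **kwargs)
            except Exception as exc:
                # On any exception, alert the monitor and re-raise so the
                # caller still sees the failure.
                send_alert(alert_name, f"task '{task_name}' failed: {exc}")
                raise
            send_alert(alert_name, f"task '{task_name}' succeeded")
            return result
        return wrapper
    return decorator

@task(task_name='tutorial', alert_name='tutorial')
def start():
    return "ok"

start()  # prints: [tutorial] task 'tutorial' succeeded
```

This works cleanly for a plain function like `start` because the whole unit of work runs inside one synchronous call; a Scrapy spider's work is driven by the Twisted reactor, which is why wrapping it is less straightforward.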