My environment: Celery 3.1.25, Python 3.6.9, Windows 10.
The Celery tasks code is shown below; QuotesSpider is the spider class name from my Scrapy project.
from celery_app import app
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
from tutorial.spiders.quotes import QuotesSpider

def crawl_run():
    # Run the spider inside the worker process, passing 'all' as the scope argument
    scope = 'all'
    process = CrawlerProcess(settings=get_project_settings())
    process.crawl(QuotesSpider, scope)
    process.start()
    process.join()

@app.task(queue='default')
def execute_task():
    return crawl_run()
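CrawlerProcess.crawl() forwards any extra positional arguments to the spider's constructor, which is how the 'all' value of scope reaches the spider. For reference, a minimal sketch of what tutorial/spiders/quotes.py might look like; the spider body and start URL here are assumptions, not the original project code:

import scrapy

class QuotesSpider(scrapy.Spider):
    name = 'quotes'
    # Assumed start URL (standard Scrapy tutorial site); replace with the real one
    start_urls = ['http://quotes.toscrape.com/']

    def __init__(self, scope=None, *args, **kwargs):
        # 'scope' receives the positional argument passed to process.crawl()
        super().__init__(*args, **kwargs)
        self.scope = scope

    def parse(self, response):
        for quote in response.css('div.quote'):
            yield {'text': quote.css('span.text::text').extract_first()}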
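To trigger the crawl, the task is enqueued like any other Celery task. A minimal sketch of the caller side, assuming the Celery app in celery_app.py has a broker configured and a worker is consuming the default queue (e.g. celery -A celery_app worker -Q default --loglevel=info):

from tasks import execute_task  # adjust the import to wherever execute_task is defined

async_result = execute_task.delay()  # push the crawl onto the 'default' queue
# async_result.get(timeout=600)      # optional: block for the result; requires a result backend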
Source: 博客园 (cnblogs)
Author: liuxianglong
Link: https://www.cnblogs.com/WalkOnMars/p/11558560.html