celery

Problems with the new-style Celery API

醉酒当歌 submitted on 2019-12-06 11:58:24
Question: I have a class that extends Celery's Task. It runs just fine with the old-style API, but I am having problems converting it to the new API.

    # In app/tasks.py
    from celery import Celery, Task

    celery = Celery()

    @celery.task
    class CustomTask(Task):
        def run(self, x):
            try:
                # do something
            except Exception, e:
                self.retry(args=[x], exc=e)

And then I run the task like so:

    CustomTask().apply_async(args=[x], queue='q1')

And I get the error:

    TypeError: run() takes exactly 2 arguments (1 given)

This SO
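A common fix for this TypeError is to register the task as a function with bind=True instead of decorating the class, but the retry behaviour the question relies on can also be pictured without Celery at all. The sketch below is a toy, stdlib-only stand-in for Task.retry(); the retry_task decorator and the flaky task are invented names for illustration:

```python
import functools

def retry_task(max_retries=3):
    """Toy stand-in for Task.retry(): re-invoke the task with the
    same arguments until it succeeds or the retry budget runs out."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for attempt in range(max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:  # Celery would re-queue here
                    last_exc = exc
            raise last_exc
        return wrapper
    return decorator

attempts = []

@retry_task(max_retries=2)
def flaky(x):
    attempts.append(x)
    if len(attempts) < 3:
        raise ValueError("transient failure")
    return x * 2

print(flaky(5))  # succeeds on the third attempt: 10
```

In real Celery the re-invocation goes back through the broker rather than running inline, which is why retry() needs the original args passed along.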

Can I use Luigi with Python Celery

夙愿已清 submitted on 2019-12-06 11:56:14
Question: I am using Celery for my web application. Celery executes parent tasks, which then execute a further pipeline of tasks. My issues with Celery: I can't get the dependency graph and visualiser I get with Luigi to see the status of my parent task, and Celery does not provide a mechanism to restart a failed pipeline from where it failed. These two things I can easily get from Luigi. So I was thinking that once Celery runs the parent task, inside that task I execute the Luigi pipeline. Is

Flask-Mail breaks Celery

强颜欢笑 submitted on 2019-12-06 11:45:24
I've got a Flask app where Celery works fine and Flask-Mail on its own works fine as well.

    from flask import Flask
    from celery import Celery
    from flask_mail import Mail, Message

    app = Flask(__name__)
    mail = Mail(app)
    celery = Celery('main_app',
                    broker='mongodb://localhost',
                    backend='mongodb://localhost')

    @celery.task
    def cel_test():
        return 'cel_test'

    @app.route('/works_maybe')
    def works_maybe():
        return cel_test.delay()

SO FAR, SO GOOD

cel_test works fine with the celery worker; everything shows up in mongo. But here's where it gets weird. The "signup" plus mail method works 100% without @celery.task, but blows up

#sora# Celery notes: call the task

泄露秘密 submitted on 2019-12-06 11:13:24
Two basic ways to call a task: apply_async() and delay(). apply_async() takes the positional arguments packed in a tuple and the keyword arguments in a dict; delay() takes them directly.

Quick reference:

    T.delay(arg, kwarg=value)                        # always a shortcut to .apply_async
    T.apply_async((arg,), {'kwarg': value})
    T.apply_async(countdown=10)                      # executes 10 seconds from now
    T.apply_async(eta=now + timedelta(seconds=10))   # executes 10 seconds from now, specified using eta
    T.apply_async(countdown=60, expires=120)         # executes in one minute from now, but expires after 2 minutes
    T.apply_async(expires=now + timedelta(days=2))   # expires in 2 days, set using a datetime

Examples:

    task.delay(arg1, arg2, kwarg1='x', kwarg2='y')
    task.apply_async(args=[arg1, arg2],
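Under the hood, countdown, eta, and expires all reduce to datetime arithmetic; a minimal stdlib-only illustration (no broker or Celery install assumed):

```python
from datetime import datetime, timedelta

now = datetime.now()

# countdown=10 is equivalent to an explicit eta 10 seconds ahead
eta = now + timedelta(seconds=10)

# countdown=60, expires=120: start in 1 minute, discard after 2 minutes
start_at = now + timedelta(seconds=60)
discard_after = now + timedelta(seconds=120)

assert eta - now == timedelta(seconds=10)
assert discard_after - start_at == timedelta(seconds=60)
```

Passing countdown is just a convenience; Celery converts it to an eta like the one computed above before the message reaches the broker.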

#sora# Celery worker guide abstract

你。 submitted on 2019-12-06 11:13:10
Starting a worker, e.g.:

    celery -A proj worker -l info
    celery -A proj worker --loglevel=INFO --concurrency=10 -n worker1.%h

Note: the hostname argument can expand the following variables:

    %h: hostname, including domain name.
    %n: hostname only.
    %d: domain name only.

E.g. if the current hostname is george.example.com, these expand to:

    worker1.%h -> worker1.george.example.com
    worker1.%n -> worker1.george
    worker1.%d -> worker1.example.com

Stopping a worker: if a worker is stuck in an infinite loop and will not exit, you can use:

    ps auxww | grep 'celery worker' | awk '{print $2}' | xargs kill -9

To restart workers, the simplest way is to use celery multi
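The %h/%n/%d expansion can be mimicked in a few lines; this is a toy re-implementation for illustration (expand_nodename is an invented helper, not Celery's actual parsing code):

```python
def expand_nodename(template, hostname):
    """Toy version of the %h/%n/%d substitutions Celery applies to -n."""
    # Split "george.example.com" into host "george" and domain "example.com"
    host, _, domain = hostname.partition('.')
    return (template.replace('%h', hostname)
                    .replace('%n', host)
                    .replace('%d', domain))

print(expand_nodename('worker1.%h', 'george.example.com'))
# worker1.george.example.com
```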

Celery SQS + Duplication of tasks + SQS visibility timeout

試著忘記壹切 submitted on 2019-12-06 11:06:01
Question: Most of my Celery tasks have an ETA longer than the maximum visibility timeout defined by Amazon SQS. The Celery documentation says: "This causes problems with ETA/countdown/retry tasks where the time to execute exceeds the visibility timeout; in fact if that happens it will be executed again, and again in a loop. So you have to increase the visibility timeout to match the time of the longest ETA you're planning to use." At the same time it also says that: The maximum visibility timeout supported by AWS
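The constraint can be stated as a simple check: the visibility timeout must cover the longest ETA, yet SQS caps the timeout at 12 hours, so very long ETAs cannot be made safe this way. A sketch under those assumptions (safe_visibility_timeout is a hypothetical helper, not part of Celery or kombu):

```python
def safe_visibility_timeout(longest_eta_seconds, max_supported=12 * 60 * 60):
    """Return a visibility timeout that covers the longest ETA, or None
    if the broker cannot support it (SQS caps the timeout at 12 hours).
    A timeout shorter than the ETA means the message is redelivered
    before the worker acknowledges it, so the task runs in a loop."""
    if longest_eta_seconds > max_supported:
        return None
    return longest_eta_seconds

assert safe_visibility_timeout(3600) == 3600          # 1-hour ETA: fine
assert safe_visibility_timeout(13 * 60 * 60) is None  # 13 hours: over the SQS cap
```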

Mixing django-celery and standalone celery

浪子不回头ぞ submitted on 2019-12-06 10:49:25
Question: We are running a website built with Django and Piston, and I want to implement Celery to offload tasks to an external server. I don't really want to run Django on the secondary server and would like to simply run a pure-Python Celery worker. Is it possible for me to write simple function stubs on the Django server and write the actual function logic on the secondary server? i.e.

Django side:

    from celery import task

    @task
    def send_message(fromUser=None, toUser=None, msgType=None, msg=None):
        pass
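This works because Celery routes by task name, not by the function object, so a stub on one host and the real implementation on another can share a name. A toy, Celery-free model of that name-based dispatch (the task decorator, registry, and apply_async_by_name below are invented for illustration):

```python
registry = {}

def task(func):
    """Toy decorator: register the callable under its name."""
    registry[func.__name__] = func
    return func

# "Worker side": the real logic, registered under the same name
# the Django-side stub would use.
@task
def send_message(fromUser=None, toUser=None, msgType=None, msg=None):
    return f"{fromUser} -> {toUser}: {msg}"

def apply_async_by_name(name, kwargs):
    """A broker would carry only the name and arguments across hosts;
    the worker looks the name up in its own registry."""
    return registry[name](**kwargs)

print(apply_async_by_name('send_message',
                          {'fromUser': 'a', 'toUser': 'b', 'msg': 'hi'}))
# a -> b: hi
```

Real Celery also offers app.send_task(name, ...), which dispatches by name string without needing the stub at all.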

Python scheduled tasks: using apscheduler (and there is also Celery~)

徘徊边缘 submitted on 2019-12-06 10:44:36
Article excerpted from: https://www.cnblogs.com/luxiaojun/p/6567132.html

1. Installation

    pip install apscheduler

2. A simple example

    # coding:utf-8
    from apscheduler.schedulers.blocking import BlockingScheduler
    import datetime

    def aps_test():
        print(datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S'), '你好')

    scheduler = BlockingScheduler()
    scheduler.add_job(func=aps_test, trigger='cron', second='*/5')
    scheduler.start()

Managing jobs: above, jobs are added with add_job(); alternatively, you can decorate a function with scheduled_job():

    import time
    from apscheduler.schedulers.blocking import BlockingScheduler

    sched = BlockingScheduler()

    @sched.scheduled_job('interval', seconds=5)
    def my_job
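The cron trigger with second='*/5' simply fires whenever the current second is a multiple of 5; a toy matcher (not apscheduler's real trigger code) makes the semantics concrete:

```python
from datetime import datetime

def matches_every_n_seconds(dt, n=5):
    """Toy check for apscheduler's cron trigger second='*/n':
    true when the timestamp's second field is a multiple of n."""
    return dt.second % n == 0

assert matches_every_n_seconds(datetime(2019, 12, 6, 10, 44, 35))
assert not matches_every_n_seconds(datetime(2019, 12, 6, 10, 44, 36))
```

By contrast, the 'interval' trigger in the second example measures 5 seconds from the scheduler's start time rather than aligning to the wall clock.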

Using Celery

爱⌒轻易说出口 submitted on 2019-12-06 10:44:22
Celery official website: http://www.celeryproject.org/
Celery official documentation (English): http://docs.celeryproject.org/en/latest/index.html
Celery official documentation (Chinese): http://docs.jinkan.org/docs/celery/

Celery architecture

Celery's architecture consists of three parts: the message broker, the task execution units (workers), and the task result store.

Message broker: Celery provides no message service of its own, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently across the nodes of a distributed system.

Task result store: the task result store saves the results of tasks executed by the workers. Celery supports storing task results in different backends, including AMQP, Redis, and others.

Use cases

Asynchronous tasks: submit time-consuming operations to Celery for asynchronous execution, such as sending SMS or email, push notifications, and audio/video processing.

Scheduled tasks: execute something on a schedule, such as daily data statistics.

Installing and configuring Celery

    pip install celery

Message broker: RabbitMQ/Redis

    app=Celery(
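The three-part architecture can be modelled in a few lines of stdlib Python: a queue standing in for the broker, a function for the worker, and a dict for the result store. This is a conceptual sketch, not how Celery is implemented:

```python
import queue

broker = queue.Queue()   # message broker: holds pending task messages
backend = {}             # task result store: task id -> result

def add(x, y):
    return x + y

def submit(task_id, func, args):
    """Producer side: publish a task message to the broker."""
    broker.put((task_id, func, args))

def worker_step():
    """Worker side: take one message, run it, store the result."""
    task_id, func, args = broker.get()
    backend[task_id] = func(*args)

submit('t1', add, (2, 3))
worker_step()
print(backend['t1'])  # 5
```

In real Celery, submit() is what apply_async() does (serialising the message into RabbitMQ or Redis), and worker_step() is the loop each worker process runs.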

Celery in detail (2)

元气小坏坏 submitted on 2019-12-06 08:31:27
  Besides Redis, there is another useful tool: Celery. Celery is a scheduler for asynchronous tasks.

  Celery is a Distributed Task Queue. "Distributed" means there can be multiple workers; "queue" means the operation is asynchronous: there is a foreman who produces tasks and states requirements, and a crew of workers waiting to be assigned work.

  When defining Celery in Python, we introduce a broker, which acts as a middleman: when the foreman creates tasks, they are all placed in the broker; at the other end of the broker, the crew of workers takes the tasks out one by one and gets to work.

  This model is inherently open-loop: the foreman has no idea how well the workers did their jobs, so we introduce a backend to save the result of each task. The backend is a bit like the broker in that it also stores information, except that what it stores is the return values of tasks. We can choose to have only failed tasks return results to the backend, so that by reading the results back we know how many tasks failed.

Celery introduction

There are a few basic concepts in Celery to understand first, otherwise it is unclear why the components below need to be installed. Concepts: broker, backend.

Broker:

  The broker is a message-transport middleman, which you can think of as a mailbox. Whenever an application calls a Celery asynchronous task, a message is passed to the broker
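The idea of letting only failed tasks write to the backend can be sketched as follows (run_and_record and the results dict are invented for illustration; in real Celery this corresponds roughly to ignoring results while still storing errors):

```python
results = {}  # standing in for the backend

def run_and_record(task_id, func, *args, store_errors_only=True):
    """Run a task; optionally record only failures in the backend."""
    try:
        value = func(*args)
    except Exception as exc:
        results[task_id] = ('FAILURE', repr(exc))
        return
    if not store_errors_only:
        results[task_id] = ('SUCCESS', value)

run_and_record('ok', lambda x: x + 1, 1)
run_and_record('bad', lambda x: 1 / x, 0)
print(results)  # only the failed task left a record
```

Reading the backend afterwards then gives exactly the count of failed tasks, which is the open-loop feedback the foreman/worker model above was missing.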