celery

How can I schedule a Task to execute at a specific time using celery?

China☆狼群 submitted on 2019-12-17 08:51:27
Question: I've looked into PeriodicTask, but the examples only cover making it recur. I'm looking for something more like cron's ability to say "execute this task every Monday at 1 a.m."

Answer 1: The recently released version 1.0.3 supports this now, thanks to Patrick Altman! Example:

from celery.task.schedules import crontab
from celery.decorators import periodic_task

@periodic_task(run_every=crontab(hour=7, minute=30, day_of_week="mon"))
def every_monday_morning():
    print("This runs every Monday morning")
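For readers on a current Celery release: celery.decorators.periodic_task was removed in later versions, and the same cron-style schedule is now expressed as a beat_schedule entry pointing at an ordinary task. A minimal sketch, assuming a module tasks.py and a Redis broker URL (both assumptions, not from the original answer):

from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='redis://localhost:6379/0')  # broker URL assumed

@app.task
def every_monday_morning():
    print("This runs every Monday morning")

# celery beat reads this schedule and enqueues the task every Monday at 7:30
app.conf.beat_schedule = {
    'every-monday-morning': {
        'task': 'tasks.every_monday_morning',
        'schedule': crontab(hour=7, minute=30, day_of_week='mon'),
    },
}

Running celery -A tasks beat alongside a worker then fires the task on schedule.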

Executing asynchronous tasks and scheduled tasks with celery

回眸只為那壹抹淺笑 submitted on 2019-12-17 06:23:24
1. What is Celery

Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages: an asynchronous task queue focused on real-time processing that also supports task scheduling.

Celery architecture

Celery's architecture consists of three parts: a message broker, task execution units (workers), and a task result store.

Message broker: Celery provides no message service of its own, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is the task-execution unit Celery provides; workers run concurrently on the nodes of a distributed system.

Task result store: the task result store saves the results of tasks executed by workers. Celery supports storing task results in several backends, including AMQP, Redis, and others.

Version support: Celery version 4.0 runs on Python (2.7, 3.4, 3.5) and PyPy (5.4, 5.5). This is the last version to support Python 2.7; from the next version (Celery 5.x) on, Python 3.5 or newer is required. If you're running an older version of Python, …
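To make the three-part architecture concrete, here is a minimal sketch that wires an app to a broker and a result store; the Redis URLs and the task are assumptions for illustration only:

from celery import Celery

# the broker carries task messages; the backend stores results (URLs assumed)
app = Celery(
    'demo',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/1',
)

@app.task
def ping():
    return 'pong'

# a worker started with "celery -A demo worker" consumes the queued message,
# runs ping(), and writes 'pong' into the result store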

Celery: a distributed asynchronous task queue

拥有回忆 submitted on 2019-12-15 14:10:09
Celery is a powerful asynchronous processing framework built around a distributed task queue: it lets task execution be decoupled entirely from the main program, and tasks can even be dispatched to other hosts to run. We usually use it to implement asynchronous tasks (async task) and scheduled tasks (crontab).

A few basic Celery concepts need to be understood first, or it won't be clear why the components below need to be installed. The concepts: Broker and Backend.

broker: the broker is a message-transport middleware, or message queue; think of it as a mailbox. Whenever an application calls one of celery's asynchronous tasks, a message is delivered to the broker; a celery worker then picks up the message and executes the corresponding code. "Broker" literally means an intermediary: it is the message queue mentioned at the start, used to send and receive messages. There are several options to choose from for the broker: RabbitMQ (a message queue), Redis (a caching database), a relational database (not recommended), and so on.

backend: stores these messages along with some of celery's execution details and results. The backend is the CELERY_RESULT_BACKEND option in Celery's configuration; it saves results and task state. If you need to track the state of a task, you must set this option. It can be a database backend or a cache backend; see CELERY_RESULT_BACKEND for the details.

For brokers, the official recommendation is RabbitMQ and Redis.
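As a sketch of the two roles this section describes, in the old-style configuration-module form the section's setting name implies (the URLs are assumptions):

# celeryconfig.py: the broker carries messages, the backend stores results/state
BROKER_URL = 'amqp://guest:guest@localhost:5672//'   # RabbitMQ as broker (URL assumed)
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'   # Redis as result backend (URL assumed)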

start celery worker and enable it for broadcast queue

陌路散爱 submitted on 2019-12-14 03:54:06
Question: I'm trying to start a celery worker so that it only listens to a single queue. This is not a problem; I can do it this way:

python -m celery worker -A my_module -Q my_queue -c 1

But now I also want this my_queue queue to be a broadcast queue, so I do this in my celeryconfig:

from kombu.common import Broadcast

CELERY_QUEUES = (Broadcast('my_queue'),)

But as soon as I do this I cannot start my worker anymore; I get an error from rabbitmq:

amqp.exceptions.PreconditionFailed: Exchange.declare: (406) …
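For context (not part of the original question): Broadcast declares a fanout exchange, so if an exchange named my_queue already exists on the broker as a direct exchange, for example from earlier runs in which my_queue was declared as an ordinary queue, RabbitMQ rejects the conflicting re-declaration with a 406 PreconditionFailed. One hedged workaround is to give the broadcast queue a name that has never been declared before:

from kombu.common import Broadcast

# 'my_bcast_queue' is a hypothetical, previously unused name, avoiding a
# clash with any existing direct exchange named 'my_queue'
CELERY_QUEUES = (Broadcast('my_bcast_queue'),)

Deleting the stale exchange on the RabbitMQ side would be the alternative.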

Python: using the Celery framework

ぐ巨炮叔叔 submitted on 2019-12-14 02:30:51
2. Basic Celery usage

1. Create a Celery application to define your task list; let's create a task file called tasks.py:

from celery import Celery

# configure celery's backend and broker
app = Celery('task1', backend='redis://127.0.0.1:6379/0', broker='redis://127.0.0.1:6379/0')

# decorate an ordinary function as a celery task
@app.task
def add(x, y):
    return x + y

With this we have only defined the task function (add) and the Celery app object; the worker, the thing that actually does the work, still has to be started.

2. Start a Celery worker to begin listening for and executing tasks. We have the broker, we have the backend, and we have the task, so now it's time to run a worker. In the directory containing tasks.py, run:

[root@localhost ~]# celery -A tasks worker --loglevel=info   # start method 1
[root@localhost ~]# celery -A tasks worker -l debug          # start method 2

Now the worker for the tasks app is working …
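The excerpt stops before the task is actually called; as a sketch of the usual next step (delay() and get() are standard Celery API, the values are illustrative):

from tasks import add

# enqueue the task; this returns immediately with an AsyncResult handle
result = add.delay(4, 6)

# block until a worker has run the task and stored the result in the backend
print(result.get(timeout=10))   # -> 10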

Decorator after @task decorator in celery

爷,独闯天下 submitted on 2019-12-14 00:17:13
Question: I'm trying to apply a decorator after the celery @task decorator, something like:

@send_email
@task
def any_function():
    print "inside the function"

I can get it to work in the way recommended in the docs, i.e. by putting the decorator before the task decorator, but in this case I would like to access the task instance in my decorator. The @send_email would have to be a class decorator. This is what I tried, without success:

class send_email(object):
    ''' wraps a Task celery class '''
    def _…
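For reference, the ordering "recommended in the docs" that the poster mentions looks roughly like the sketch below, using the old-style @task decorator the question itself uses; it runs, but as the poster notes, the wrapper only ever sees the plain function, never the task instance (the email logic is a hypothetical stand-in):

import functools
from celery import task

def send_email(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        print("would send a notification email here")  # stand-in for real email code
        return result
    return wrapper

@task
@send_email   # applied before @task, so the wrapper runs inside the task body
def any_function():
    print("inside the function")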

How to get Task ID in celery django from the currently running Shared Task itself?

偶尔善良 submitted on 2019-12-13 20:10:04
Question: In my views.py I am using celery to run a shared task defined in tasks.py. Here is how I call it from views.py:

task = task_addnums.delay()
task_id = task.id

tasks.py looks like this:

from celery import shared_task
from celery.result import AsyncResult

@shared_task
def task_addnums():
    # print self.request.id
    # do something
    return True

Now, as we can see, we already have the task_id from task.id in views.py. But let's say I want to fetch the task id from within the shared_task itself: how can I? The goal is to get …
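The commented-out self.request.id line in the excerpt hints at the standard mechanism: declaring the task with bind=True makes Celery pass the task instance as the first argument, which exposes the running task's id. A minimal sketch:

from celery import shared_task

@shared_task(bind=True)
def task_addnums(self):
    # with bind=True the task instance is the first argument, and the
    # currently running task's id is available on self.request
    task_id = self.request.id
    return task_id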

Error message 'No handlers could be found for logger “multiprocessing”' using Celery

泄露秘密 submitted on 2019-12-13 15:03:49
Question: RabbitMQ now seems to be working correctly. However, when I try python -m celery.bin.celeryd --loglevel=INFO (regular celeryd doesn't work), I get the error: No handlers could be found for logger "multiprocessing". Here's the full output (redacted slightly):

[2011-06-06 02:08:08,105: WARNING/MainProcess]
 -------------- celery@blahblah v2.2.6
---- **** -----
--- * ***  * -- [Configuration]
-- * - **** ---   . broker:  amqplib://blah@localhost:5672/vhost
- ** ----------   . loader:  celery.loaders …
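For background (not from the original post): in Python 2, "No handlers could be found for logger X" means that logger emitted a record before any handler was attached to it. A generic, hedged workaround is to attach a handler to the multiprocessing logger before starting celeryd; whether that resolves this poster's underlying problem is not something the excerpt settles:

import logging

# give the "multiprocessing" logger somewhere to send records, which
# silences the "No handlers could be found" warning
logging.getLogger("multiprocessing").addHandler(logging.StreamHandler())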

Celery: how to add a callback function when calling a remote task (with send_task)

偶尔善良 submitted on 2019-12-13 14:43:13
Question: You can use celery to call a task by name that is registered in a different process (or even on a different machine):

celery.send_task(task_name, args=args, kwargs=kwargs)

(http://celery.readthedocs.org/en/latest/reference/celery.html#celery.Celery.send_task)

I now would like to be able to add a callback that will be executed as soon as the task has finished, and that will be executed within the process that called the task. My setup: I have a server A that runs a django-powered website, and I …
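As far as the standard API goes, send_task returns an AsyncResult, so the simplest caller-side "callback" is to wait on that handle and then run local code; a sketch in which app is the caller's Celery instance and handle_result and the task name are hypothetical (a truly asynchronous caller-side callback would need polling or a result-event consumer, presumably what the truncated question goes on to ask about):

# task name and arguments are illustrative
result = app.send_task('remote.add', args=(2, 3))

# block until the remote worker finishes, then "call back" locally
handle_result(result.get(timeout=30))   # handle_result is a hypothetical local function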