celery

Retrieve task result by id in Celery

馋奶兔 submitted on 2019-11-27 15:37:31
Question: I am trying to retrieve the result of a task which has completed. This works:

    from proj.tasks import add
    res = add.delay(3, 4)
    res.get()       # 7
    res.status      # 'SUCCESS'
    res.id          # '0d4b36e3-a503-45e4-9125-cfec0a7dca30'

But I want to run this from another application, so I restart the Python shell and try:

    from proj.tasks import add
    res = add.AsyncResult('0d4b36e3-a503-45e4-9125-cfec0a7dca30')
    res.status      # 'PENDING'
    res.get()       # Error

How can I retrieve the result? Answer 1: It works using AsyncResult. (see this answer)
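A minimal sketch of looking up a stored result by id from a separate process, assuming the Celery app lives in proj/celery.py as `app` and is configured with a result backend (module path and task id here are illustrative):

    # fetch_result.py -- run in a different process than the one that queued the task
    from proj.celery import app

    task_id = '0d4b36e3-a503-45e4-9125-cfec0a7dca30'  # id saved when the task was queued
    res = app.AsyncResult(task_id)                    # bind to the existing result by id

    print(res.status)           # 'SUCCESS' once the worker has stored the result
    print(res.get(timeout=5))   # raises if the backend returns nothing within 5 seconds

Note that the status shows 'PENDING' whenever the backend has no record of the id, which is why a missing or misconfigured result backend produces exactly the symptom described in the question.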

Realtime progress tracking of celery tasks

廉价感情. submitted on 2019-11-27 15:20:02
Question: I have a main Celery task that starts multiple sub-tasks (thousands), each performing multiple actions (the same actions per sub-task). What I want is, from the main Celery task, to track in real time, for each action, how many are done and how many have failed in each sub-task. In summary! Main task: receives a list of objects and a list of actions to do for each object. For each object, a sub-task is started to perform the actions for that object. The main task is finished when all the sub-tasks are finished
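One common way to expose this kind of progress (a sketch under assumed names, not taken from the question itself) is for each sub-task to publish custom state with update_state and for the caller to poll AsyncResult.info:

    from celery import Celery

    app = Celery('proj', broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/0')

    @app.task(bind=True)
    def process_object(self, obj, action_names):
        done, failed = 0, 0
        total = len(action_names)
        for name in action_names:
            try:
                # placeholder for the real action; here it always "succeeds"
                _ = f"{name} applied to {obj}"
                done += 1
            except Exception:
                failed += 1
            # publish intermediate progress; readable via AsyncResult(id).info
            self.update_state(state='PROGRESS',
                              meta={'done': done, 'failed': failed, 'total': total})
        return {'done': done, 'failed': failed, 'total': total}

    # caller side (e.g. the main task or a monitoring view): poll progress
    res = process_object.delay('obj-1', ['resize', 'upload'])
    print(res.state, res.info)   # e.g. ('PROGRESS', {'done': 1, 'failed': 0, 'total': 2})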

Django and Celery - re-loading code into Celery after a change

帅比萌擦擦* submitted on 2019-11-27 14:47:54
Question: If I make a change to tasks.py while Celery is running, is there a mechanism by which it can re-load the updated code, or do I have to shut Celery down and re-load? I read Celery had an --autoreload argument in older versions, but I can't find it in the current version: celery: error: unrecognized arguments: --autoreload Answer 1: Unfortunately --autoreload doesn't work and it is deprecated. You can use Watchdog, which provides watchmedo, a shell utility to perform actions based on file events. pip
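As a hedged example of the Watchdog approach (assuming the project module is named proj and Watchdog is installed; exact option names may differ between Watchdog versions), the worker can be run under watchmedo so it restarts whenever a .py file changes:

    watchmedo auto-restart --directory=./ --patterns='*.py' --recursive -- celery -A proj worker -l info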

How to combine Celery with asyncio?

*爱你&永不变心* submitted on 2019-11-27 14:27:33
Question: How can I create a wrapper that makes Celery tasks look like asyncio.Task? Or is there a better way to integrate Celery with asyncio? @asksol, the creator of Celery, said this: It's quite common to use Celery as a distributed layer on top of async I/O frameworks (top tip: routing CPU-bound tasks to a prefork worker means they will not block your event loop). But I could not find any code examples specifically for the asyncio framework. Answer 1: That will be possible from Celery version 5.0 as
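Until such native support exists, one workaround (a minimal sketch, not the official integration; it assumes the add task from the Celery "first steps" project) is to await a Celery result by pushing the blocking get() call onto a thread executor so the event loop stays free:

    import asyncio
    from proj.tasks import add

    async def wait_for_task(async_result, timeout=30):
        """Await a Celery AsyncResult without blocking the event loop.

        AsyncResult.get() is blocking, so it is run on the default
        thread-pool executor and awaited from there.
        """
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(None, lambda: async_result.get(timeout=timeout))

    async def main():
        res = add.delay(3, 4)              # queue the task as usual
        value = await wait_for_task(res)   # await its result inside asyncio code
        print(value)                       # 7

    asyncio.run(main())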

Celery: WorkerLostError: Worker exited prematurely: signal 9 (SIGKILL)

你说的曾经没有我的故事 submitted on 2019-11-27 14:21:17
Question: I use Celery with RabbitMQ in my Django app (on Elastic Beanstalk) to manage background tasks, and I daemonized it using Supervisor. The problem now is that one of the periodic tasks I defined is failing (after a week in which it worked properly). The error I got is: [01/Apr/2014 23:04:03] [ERROR] [celery.worker.job:272] Task clean-dead-sessions[1bfb5a0a-7914-4623-8b5b-35fc68443d2e] raised unexpected: WorkerLostError('Worker exited prematurely: signal 9 (SIGKILL).',) Traceback (most

Celery 'Getting Started' not able to retrieve results; always pending

拟墨画扇 submitted on 2019-11-27 14:12:32
I've been trying to follow the Celery First Steps With Celery and Next Steps guides. My setup is Windows 7 64-bit, Anaconda Python 2.7 (32-bit), installed Erlang 32-bit binaries, RabbitMQ server, and celery (with pip install celery). Following the guide I created a proj folder with __init__.py, tasks.py, and celery.py. My __init__.py is empty. Here's celery.py:

    from __future__ import absolute_import
    from celery import Celery

    app = Celery('proj',
                 broker='amqp://',
                 backend='amqp://',
                 include=['proj.tasks'])

    # Optional configuration, see the application user guide
    app.conf.update(
        CELERY_TASK_RESULT
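For reference, a minimal tasks.py that fits this layout (a sketch along the lines of the tutorial, not the asker's actual file) looks like:

    # proj/tasks.py
    from __future__ import absolute_import
    from proj.celery import app

    @app.task
    def add(x, y):
        return x + y

With this in place, the worker is usually started from the directory containing proj with a command along the lines of celery -A proj worker -l info, so that proj.tasks is importable.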

Concurrent asynchronous processes with Python, Flask and Celery

浪尽此生 submitted on 2019-11-27 14:09:44
Question: I am working on a small but computationally intensive Python app. The computationally intensive work can be broken into several pieces that can be executed concurrently. I am trying to identify a suitable stack to accomplish this. Currently I am planning to use a Flask app on Apache2 + WSGI with Celery for the task queue. In the following, will a_long_process(), another_long_process() and yet_another_long_process() execute concurrently if there are 3 or more workers available? Will the Flask
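A hedged sketch of what dispatching those three pieces concurrently could look like (the task bodies and broker URL are placeholders): with at least three worker processes available, each task can run in its own process at the same time.

    from celery import Celery, group

    app = Celery('work', broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/0')

    @app.task
    def a_long_process():
        return 'a done'

    @app.task
    def another_long_process():
        return 'b done'

    @app.task
    def yet_another_long_process():
        return 'c done'

    # In the Flask view: queue all three without blocking the request,
    # then collect the combined result later (or in a callback).
    job = group(a_long_process.s(),
                another_long_process.s(),
                yet_another_long_process.s()).apply_async()
    # job.get() blocks until all three finish; poll job.ready() instead
    # if the HTTP response must return immediately.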

Celery scheduled tasks

点点圈 submitted on 2019-11-27 13:08:13
Introduction to Celery: Celery is a powerful distributed task queue made up of three main parts: the message broker (Broker), the task execution units (Workers), and the result store (Backend). It lets task execution be decoupled completely from the main program, and tasks can even be dispatched to other hosts. It is usually used to implement asynchronous tasks (async task) and scheduled tasks (crontab). Asynchronous tasks are time-consuming operations such as sending email, uploading files, or processing images; scheduled tasks are tasks that need to run at a specific time. Its architecture is made up of the following components:

Task queue: a task queue is a mechanism for distributing work across threads and machines. A task queue contains units of work called tasks. Dedicated worker processes constantly monitor the task queue, pick up new tasks, and process them.

Task module: contains the asynchronous tasks and the scheduled tasks. Asynchronous tasks are usually triggered in the business logic and sent to the task queue, while scheduled tasks are sent to the task queue periodically by the Celery Beat process.

Message broker (Broker): the Broker is the task dispatch queue; it receives messages (i.e. tasks) from task producers and stores them in the queue. Celery itself does not provide a queueing service; the official recommendation is to use RabbitMQ, Redis, or similar.

Task execution unit (Worker): the Worker is the processing unit that executes tasks. It monitors the message queue in real time, fetches the tasks scheduled onto the queue, and executes them.

Result store (Backend): the Backend stores the results of task execution so they can be queried later. As with the message broker, the store can also use
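A minimal sketch (module name, schedule, and Redis URLs are illustrative, not from the original post) showing these pieces in code: an app pointing at a broker and a backend, one asynchronous task, and one scheduled task driven by Celery Beat:

    # scheduler.py
    from celery import Celery
    from celery.schedules import crontab

    app = Celery('scheduler',
                 broker='redis://127.0.0.1:6379/0',    # Broker: where tasks are queued
                 backend='redis://127.0.0.1:6379/1')   # Backend: where results are stored

    @app.task
    def send_report(day):
        return f'report for {day} sent'

    # Celery Beat pushes this task onto the queue every morning at 07:30
    app.conf.beat_schedule = {
        'daily-report': {
            'task': 'scheduler.send_report',
            'schedule': crontab(hour=7, minute=30),
            'args': ('today',),
        },
    }

    # Worker:  celery -A scheduler worker -l info
    # Beat:    celery -A scheduler beat -l info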

Temporary queue made in Celery

纵然是瞬间 submitted on 2019-11-27 12:57:40
Question: I am using Celery with RabbitMQ. Lately, I have noticed that a large number of temporary queues are getting made. So, I experimented and found that when a task fails (that is, a task raises an exception), a temporary queue with a random name (like c76861943b0a4f3aaa6a99a6db06952c) is created and the queue remains. Some properties of the temporary queue as found in rabbitmqadmin are as follows:

    auto_delete : True
    consumers : 0
    durable : False
    messages : 1
    messages_ready : 1

And one such

[Python celery] -- 2019-08-16 13:26:18

邮差的信 submitted on 2019-11-27 12:38:51
Contents. Original article: http://blog.gqylpy.com/gqy/380

Installation: pip install celery

celery is a module implemented in Python, used to execute asynchronous, scheduled, and periodic tasks. Celery's components:

User task app: used to produce tasks
Pipes broker and backend: the former stores tasks, the latter stores task execution results
Worker: responsible for executing tasks

Simple example. Worker file (workers.py):

    import time
    from celery import Celery

    # Create a Celery instance; this is our user application app
    my_task = Celery(
        'tasks',
        broker='redis://127.0.0.1:6380',   # where tasks are stored; here Redis
        backend='redis://127.0.0.1:6380',  # where task execution results are stored
    )

    # Create a task for the application
    @my_task.task
    def fn1(x, y):
        time.sleep(10)
        return x + y

    """
    To run:
    Linux:   celery worker -A workers -l INFO
    Windows: celery worker -A workers -l INFO -P
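Continuing in the same spirit, a hedged sketch of the other side described above, the "user task app" (the file name is an assumption following the blog's structure): a small client that pushes a task into the broker and later reads its result from the backend:

    # client.py -- the "user task app" side
    from workers import fn1

    # Send the task to the broker; the worker process picks it up and runs it
    result = fn1.delay(10, 20)
    print(result.id)                 # task id, can be stored and queried later

    # Block until the worker has written the result into the backend
    print(result.get(timeout=30))    # 30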