celery

How to check task status in Celery?

Submitted by 社会主义新天地 on 2019-11-26 19:40:03
How does one check whether a task is running in Celery (specifically, I'm using django-celery)? I've read the documentation and googled, but I can't find a call like:

    my_example_task.state() == RUNNING

My use case is that I have an external (Java) service for transcoding. When I send a document to be transcoded, I want to check whether the task that runs that service is running, and if not, to (re)start it. I'm using the current stable version, 2.4 I believe.

Return the task_id (which is given by .delay()) and afterwards ask the Celery instance about the state:

    x = method.delay(1, 2)
    print(x.task_id)
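There is no RUNNING state, but the same check can be done through AsyncResult. A sketch, assuming a result backend is configured and the task id was saved somewhere; note that the STARTED state is only reported when the worker tracks started tasks (the CELERY_TRACK_STARTED / task_track_started setting):

    from celery.result import AsyncResult

    res = my_example_task.delay(document_id)  # document_id is illustrative
    task_id = res.id                          # persist this id somewhere

    # Later, from any process sharing the same app and result backend:
    state = AsyncResult(task_id).state        # 'PENDING', 'STARTED', 'SUCCESS', ...
    if state not in ('STARTED', 'SUCCESS'):
        my_example_task.delay(document_id)    # (re)start the transcoding task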

How to chain a Celery task that returns a list into a group?

Submitted by 落花浮王杯 on 2019-11-26 19:20:36
Question: I want to create a group from a list returned by a Celery task, so that for each item in the task's result set, one task will be added to the group. Here's a simple code example to explain the use case; the ??? should be the result from the previous task:

    @celery.task
    def get_list(amount):
        # In reality, fetch a list of items from a db
        return [i for i in range(amount)]

    @celery.task
    def process_item(item):
        # do stuff
        pass

    process_list = (get_list.s(10) | group(process_item.s(i) for i in ???))
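A common workaround (not shown in this excerpt, but the usual answer to this question) is an intermediate task that builds the group at runtime from the first task's return value; a sketch, reusing the celery app object from the question:

    from celery import group, subtask

    @celery.task
    def dmap(items, callback):
        # Clone the callback signature once per item and run the clones as a group.
        sig = subtask(callback)
        return group(sig.clone([item]) for item in items)()

    # The list returned by get_list becomes the first argument of dmap:
    process_list = (get_list.s(10) | dmap.s(process_item.s()))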

How can I recover unacknowledged AMQP messages from other channels than my connection's own?

Submitted by 社会主义新天地 on 2019-11-26 19:14:25
Question: It seems the longer I keep my RabbitMQ server running, the more trouble I have with unacknowledged messages. I would love to requeue them. There does seem to be an AMQP command for this, but it only applies to the channel that your own connection is using. I built a little pika script to at least try it out, but I am either missing something or it cannot be done this way (how about with rabbitmqctl?):

    import pika

    credentials = pika.PlainCredentials('***', '***')
    parameters = pika.ConnectionParameters('localhost', credentials=credentials)  # host assumed
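For what it's worth, AMQP's basic.recover only ever applies to your own channel; messages left unacknowledged by other channels go back to the queue only when those channels (or their connections) close, e.g. via rabbitmqctl close_connection. A sketch of the per-channel recover with pika, connection details assumed:

    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()

    # Ask the broker to redeliver messages unacked on THIS channel only;
    # it cannot reach messages held by other connections' channels.
    channel.basic_recover(requeue=True)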

Distributed task queues (Ex. Celery) vs crontab scripts

Submitted by 只谈情不闲聊 on 2019-11-26 18:48:30
Question: I'm having trouble understanding the purpose of 'distributed task queues', for example Python's Celery library. I know that in Celery you can set timed windows for functions to be executed. However, that can also easily be done with a Linux crontab pointed at a Python script. And as far as I know, and as shown by my own django-celery web apps, Celery consumes much more RAM than just setting up a raw crontab: a few hundred MB of difference for a relatively small app.
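For the scheduling part specifically, here is a sketch of what a crontab-like entry looks like in Celery (Celery 4 setting names; the broker URL and task are assumptions):

    from celery import Celery
    from celery.schedules import crontab

    app = Celery('tasks', broker='redis://localhost:6379/0')  # broker assumed

    @app.task
    def nightly_cleanup():
        pass

    # Roughly equivalent to the crontab line "0 4 * * * cleanup.py",
    # but executed by a pool of workers rather than the local machine:
    app.conf.beat_schedule = {
        'nightly-cleanup': {
            'task': 'tasks.nightly_cleanup',
            'schedule': crontab(hour=4, minute=0),
        },
    }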

Celery

Submitted by 南笙酒味 on 2019-11-26 17:28:28
What is Celery? Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages: an asynchronous task queue focused on real-time processing that also supports task scheduling.

What can Celery do? Asynchronous tasks and scheduled (periodic) tasks.

Celery architecture: Celery's architecture consists of three parts: a message broker, task execution units (workers), and a task result store.

Message broker: Celery provides no message service of its own, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.

Task result store: the task result store holds the results of the tasks the workers execute; Celery supports several result backends, including AMQP, Redis, and others.

Version support: Celery version 4.0 runs on Python (2.7, 3.4, 3.5) and PyPy (5.4, 5.5). This is the last version to support Python 2.7, and from the next version (Celery 5.x) Python 3.5 or newer is required. If you're running an older version of Python, you need to be running an older version of Celery.
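A minimal sketch mapping the three components onto code (the Redis URLs are assumptions):

    from celery import Celery

    app = Celery(
        'demo',
        broker='redis://127.0.0.1:6379/0',   # message broker: where tasks wait
        backend='redis://127.0.0.1:6379/1',  # result store: where return values go
    )

    @app.task
    def add(x, y):
        return x + y

    # The worker (the execution unit) is started as a separate process:
    #   celery worker -A demo -l INFO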

Celery 'Getting Started' not able to retrieve results; always pending

Submitted by 北城余情 on 2019-11-26 16:29:44
Question: I've been trying to follow Celery's First Steps With Celery and Next Steps guides. My setup is Windows 7 64-bit, Anaconda Python 2.7 (32-bit), installed Erlang 32-bit binaries, RabbitMQ server, and Celery (via pip install celery). Following the guide I created a proj folder with __init__.py, tasks.py, and celery.py. My __init__.py is empty. Here's celery.py:

    from __future__ import absolute_import
    from celery import Celery

    app = Celery('proj',
                 broker='amqp://',
                 backend='amqp://',
                 include=['proj.tasks'])
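A sketch of the retrieval side under that layout; on Windows the default prefork pool is a frequent cause of results that stay PENDING, so starting the worker with the solo (or eventlet) pool is a common workaround:

    # Start the worker from the directory containing proj/:
    #   celery -A proj worker -l info --pool=solo
    from proj.tasks import add  # add() is the example task from the guide

    res = add.delay(4, 4)
    print(res.get(timeout=10))  # raises a TimeoutError if no result ever arrives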

How do you run a worker with AWS Elastic Beanstalk?

Submitted by 断了今生、忘了曾经 on 2019-11-26 15:48:42
I am launching a Django application on AWS Elastic Beanstalk. I'd like to run a background task or worker in order to run Celery. I cannot find whether this is possible or not, and if yes, how it could be achieved. Here is what I am doing right now, but it produces an event-type error every time:

    container_commands:
      01_syncdb:
        command: "django-admin.py syncdb --noinput"
        leader_only: true
      50_sqs_email:
        command: "./manage.py celery worker --loglevel=info"
        leader_only: true

Answer (yellowcap): As @chris-wheadon suggested in his comment, you should try to run Celery as a daemon in the background. AWS Elastic Beanstalk already uses supervisord to run daemon processes.
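A sketch of a supervisord program entry for the worker, the kind of thing typically dropped in via .ebextensions; every path and the app name here are illustrative assumptions:

    [program:celeryd]
    ; run the Celery worker as a supervised daemon
    directory=/opt/python/current/app
    command=/opt/python/run/venv/bin/celery worker -A proj --loglevel=INFO
    autostart=true
    autorestart=true
    stopwaitsecs=600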

【Python celery】 -- 2019-08-08 20:39:56

Submitted by 只愿长相守 on 2019-11-26 14:16:55
Contents

Original post: http://106.13.73.98/__/156/

Installation: pip install celery

celery is a Python module for executing asynchronous and periodic scheduled tasks. Its structure:

- The user's application (app): produces tasks.
- The pipes, broker and backend: the former holds pending tasks, the latter holds the results of executed tasks.
- The workers: execute the tasks.

Simple example

Worker file (workers.py):

    import time
    from celery import Celery

    # Create a Celery instance; this is the user's application (app)
    my_task = Celery(
        'tasks',
        broker='redis://127.0.0.1:6380',   # where pending tasks are stored, here Redis
        backend='redis://127.0.0.1:6380',  # where task results are stored
    )

    # Register a task with the application
    @my_task.task
    def fn1(x, y):
        time.sleep(10)
        return x + y

    # Run the worker with:
    #   Linux:   celery worker -A workers -l INFO
    #   Windows: celery worker -A workers -l INFO -P eventlet
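The excerpt stops at the worker side; a sketch of the client side that would enqueue a task and read its result (the file layout is an assumption):

    # client.py -- assumed companion to workers.py above
    from workers import fn1

    result = fn1.delay(2, 3)       # push the task to the Redis broker
    print(result.get(timeout=30))  # blocks until the worker writes 5 to the backend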

Retrieve list of tasks in a queue in Celery

Submitted by 江枫思渺然 on 2019-11-26 12:18:35
Question: How can I retrieve a list of tasks in a queue that are yet to be processed?

Answer 1: EDIT: See other answers for getting a list of tasks in the queue. You should look here: Celery Guide - Inspecting Workers. Basically this:

    >>> from celery.task.control import inspect

    # Inspect all nodes.
    >>> i = inspect()

    # Show the items that have an ETA or are scheduled for later processing
    >>> i.scheduled()

    # Show tasks that are currently active.
    >>> i.active()

    # Show tasks that have been claimed by workers
    >>> i.reserved()
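Note that on Celery 4 and later the celery.task.control import path is gone; the same inspection goes through the app instance (the app name and broker URL are assumptions):

    from celery import Celery

    app = Celery('proj', broker='redis://localhost:6379/0')  # assumed
    i = app.control.inspect()

    print(i.active())    # tasks currently executing on each worker
    print(i.reserved())  # tasks prefetched by workers but not yet started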

Celery with Amazon SQS

Submitted by 懵懂的女人 on 2019-11-26 11:57:31
Question: I want to use Amazon SQS as the broker backend for Celery. There's an SQS transport implementation for Kombu, which Celery depends on, but there is not enough documentation for using it, so I cannot work out how to configure SQS for Celery. Has anybody succeeded in setting up SQS with Celery?

Answer 1: I ran into this question several times but still wasn't entirely sure how to set up Celery to work with SQS. It turns out that it is quite easy with the latest versions of Kombu and Celery.
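A minimal configuration sketch, assuming AWS credentials are available in the environment and a us-east-1 region; the transport options below are the documented SQS knobs:

    from celery import Celery

    # With a bare sqs:// URL, credentials are read from the environment
    # (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) or from the instance role.
    app = Celery('tasks', broker='sqs://')

    app.conf.broker_transport_options = {
        'region': 'us-east-1',       # assumed region
        'visibility_timeout': 3600,  # seconds before an unacked task reappears
        'polling_interval': 1,       # seconds between SQS polls
    }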