Celery

夙愿已清 submitted on 2019-11-26 11:13:52
1. What is Celery

Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages. It is an asynchronous task queue focused on real-time processing, and it also supports task scheduling.

Celery architecture

Celery's architecture consists of three parts: a message broker, task execution units (workers), and a task result store.

Message broker: Celery does not provide a messaging service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

Task execution unit: the worker is Celery's unit of task execution; workers run concurrently across the nodes of a distributed system.

Task result store: the task result store holds the results of tasks executed by the workers. Celery supports storing results in different backends, including AMQP, Redis, and others.

Version support: Celery version 4.0 runs on Python (2.7, 3.4, 3.5) and PyPy (5.4, 5.5). This is the last version to support Python 2.7, and from the next version (Celery 5.x) Python 3.5 or newer is required. If you're running an older version of
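To make the three-part architecture concrete, here is a minimal sketch that wires a broker and a result store to one task (the Redis URLs and the module name tasks.py are assumptions for illustration, not from the post above):

    # tasks.py -- minimal Celery app: Redis as both broker and result store (assumed)
    from celery import Celery

    app = Celery(
        'tasks',
        broker='redis://localhost:6379/0',   # message broker
        backend='redis://localhost:6379/1',  # task result store
    )

    @app.task
    def add(x, y):
        return x + y

Start a worker with celery -A tasks worker --loglevel=info; then add.delay(2, 2).get() sends a message through the broker, a worker executes it, and the result lands in the result store.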

Celery Received unregistered task of type (run example)

给你一囗甜甜゛ submitted on 2019-11-26 10:27:23
Question: I'm trying to run an example from the Celery documentation. I run: celeryd --loglevel=INFO

/usr/local/lib/python2.7/dist-packages/celery/loaders/default.py:64: NotConfigured: No 'celeryconfig' module found! Please make sure it exists and is available to Python. "is available to Python." % (configname, )))

[2012-03-19 04:26:34,899: WARNING/MainProcess]
 -------------- celery@ubuntu v2.5.1
---- **** -----
--- * ***  * -- [Configuration]
-- * - **** ---   . broker: amqp://guest@localhost:5672//
- ** -
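That NotConfigured warning means celeryd found no celeryconfig.py on the Python path, so it never imports the module that defines the task, which is also the usual cause of "Received unregistered task". A minimal sketch for that 2.x era (the module name tasks is an assumption):

    # celeryconfig.py -- must sit in the directory where celeryd is started,
    # or somewhere else on the Python path
    BROKER_URL = 'amqp://guest@localhost:5672//'

    # Modules imported at worker startup; a task defined in a module that is
    # not listed here shows up as an unregistered task type.
    CELERY_IMPORTS = ('tasks',)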

Running “unique” tasks with celery

末鹿安然 submitted on 2019-11-26 09:18:39
Question: I use celery to update RSS feeds on my news aggregation site. I use one @task for each feed, and things seem to work nicely. There's a detail I'm not sure I handle well, though: all feeds are updated once every minute with a @periodic_task, but what if a feed is still updating from the last periodic task when a new one is started? (for example, if the feed is really slow, or offline and the task is held in a retry loop) Currently I store task results and check their status like this:
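One common pattern for this is a per-feed lock, so a new periodic run exits early while the previous update is still in flight. A sketch using a Redis lock (the key scheme, timeout, and task body are assumptions):

    import redis
    from celery import Celery

    app = Celery('feeds', broker='redis://localhost:6379/0')
    r = redis.Redis()

    LOCK_EXPIRE = 60 * 5  # seconds; lock self-expires in case a worker dies mid-update

    @app.task
    def update_feed(feed_url):
        lock_id = 'feed-lock:' + feed_url
        # SET NX succeeds only for the first caller; later callers skip this run.
        if not r.set(lock_id, 'locked', nx=True, ex=LOCK_EXPIRE):
            return 'skipped: previous update still running'
        try:
            fetch_and_parse(feed_url)  # hypothetical helper doing the real work
        finally:
            r.delete(lock_id)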

Understanding celery task prefetching

旧巷老猫 submitted on 2019-11-26 08:05:44
Question: I just found out about the configuration option CELERYD_PREFETCH_MULTIPLIER (docs). The default is 4, but (I believe) I want prefetching off, or as low as possible. I set it to 1 now, which is close enough to what I'm looking for, but there are still some things I don't understand: Why is this prefetching a good idea? I don't really see a reason for it, unless there's a lot of latency between the message queue and the workers (in my case, they are currently running on the same host and
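For reference, the closest Celery gets to "prefetching off" is the pair of settings below (old-style names, matching the question); with late acks each worker process reserves only the task it is actually executing:

    # celeryconfig.py
    CELERYD_PREFETCH_MULTIPLIER = 1  # reserve at most one message per worker process
    CELERY_ACKS_LATE = True          # acknowledge after the task runs, not on receipt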

How to check task status in Celery?

吃可爱长大的小学妹 submitted on 2019-11-26 08:01:31
Question: How does one check whether a task is running in celery (specifically, I'm using celery-django)? I've read the documentation, and I've googled, but I can't see a call like: my_example_task.state() == RUNNING My use case is that I have an external (Java) service for transcoding. When I send a document to be transcoded, I want to check whether the task that runs that service is running, and if not, to (re)start it. I'm using the current stable versions - 2.4, I believe.

Answer 1: Return the task_id
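There is no RUNNING state; Celery's built-in states are PENDING, STARTED, SUCCESS, FAILURE, RETRY, and REVOKED, and STARTED is only reported when the worker is told to track started tasks. A sketch of the usual check via AsyncResult (the task and module names are assumptions):

    from celery.result import AsyncResult
    from myapp.tasks import my_example_task  # hypothetical task module

    document_id = 42  # hypothetical id of the document to transcode
    result = my_example_task.delay(document_id)  # kick off, keep the id somewhere
    task_id = result.id

    # Later, from any process that can reach the result backend:
    res = AsyncResult(task_id)
    if res.state == 'STARTED':  # requires CELERY_TRACK_STARTED = True
        print('transcoding in progress')
    elif res.state in ('PENDING', 'FAILURE'):  # an unknown id also reports PENDING
        my_example_task.delay(document_id)     # (re)start it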

Cancel an already executing task with Celery?

流过昼夜 submitted on 2019-11-26 07:55:42
Question: I have been reading the docs and searching, but cannot seem to find a straight answer: can you cancel an already executing task? (as in: the task has started, takes a while, and halfway through it needs to be cancelled) I found this in the Celery FAQ:

>>> result = add.apply_async(args=[2, 2], countdown=120)
>>> result.revoke()

But I am unclear whether this will cancel queued tasks or kill a running process on a worker. Thanks for any light you can shed!

Answer 1: revoke cancels the
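For the "kill a running process" half of the question, revoke takes a terminate flag; a short sketch (the signal choice is an assumption, and terminating mid-task loses whatever the task was doing):

    >>> result = add.apply_async(args=[2, 2], countdown=120)
    >>> result.revoke()                          # discards the task if it has not started
    >>> result.revoke(terminate=True)            # also ends it if already executing (SIGTERM)
    >>> result.revoke(terminate=True, signal='SIGKILL')  # harder kill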

Running delayed tasks with celery

雨燕双飞 submitted on 2019-11-26 05:37:29
Because of a project requirement, we needed to run an asynchronous task that pushes a message to the user after a specified delay. Since I had previously only used celery's periodic tasks, after consulting some material I found that the official documentation covers this.

T.delay(arg, kwargs=value) is the usual command for running a celery task asynchronously. A less commonly used alternative, T.apply_async((arg,), {'kwarg': value}, countdown=60, expires=120), can be used to run delayed tasks: countdown specifies how many seconds to wait before execution, and expires specifies the maximum waiting time, i.e. the expiration time.

Because a celery delayed task that is still unexecuted past its scheduled time may end up being executed more than once, it is best to specify expires to prevent this.

Source: CSDN Author: Galois1764 Link: https://blog.csdn.net/Galois1764/article/details/103241038
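A sketch of the pattern described above, combining countdown/eta with expires so a late task is dropped instead of re-run (the task name, broker URL, and arguments are assumptions):

    from datetime import datetime, timedelta, timezone

    from celery import Celery

    app = Celery('push', broker='redis://localhost:6379/0')

    @app.task
    def push_message(user_id, text):
        print('pushing to', user_id, ':', text)

    # Run 60 seconds from now; discard if not started within 120 seconds.
    push_message.apply_async(('user-42', 'your report is ready'),
                             countdown=60, expires=120)

    # Or at an absolute time, with a 10-minute grace period:
    when = datetime.now(timezone.utc) + timedelta(hours=1)
    push_message.apply_async(('user-42', 'reminder'),
                             eta=when, expires=when + timedelta(minutes=10))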

How do you run a worker with AWS Elastic Beanstalk?

依然范特西╮ submitted on 2019-11-26 04:37:12
Question: I am launching a Django application on AWS Elastic Beanstalk. I'd like to run a background task or worker in order to run celery. I cannot find whether it is possible or not. If yes, how could it be achieved? Here is what I am doing right now, but this produces an event-type error every time:

container_commands:
  01_syncdb:
    command: "django-admin.py syncdb --noinput"
    leader_only: true
  50_sqs_email:
    command: "./manage.py celery worker --loglevel=info"
    leader_only: true

Answer 1: As @chris
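The usual pitfall here is that container_commands run once during deployment and are expected to exit, so a foreground celery worker started there hangs the deploy. The common workaround is to daemonize the worker, for example under the supervisord that the old Python platform already runs; a rough sketch only, since every path and the program name below are assumptions that vary by platform version:

    # .ebextensions/celery.config -- sketch, paths assumed
    files:
      "/opt/python/etc/celeryd.conf":
        mode: "000644"
        content: |
          [program:celeryd]
          command=/opt/python/run/venv/bin/python manage.py celery worker --loglevel=info
          directory=/opt/python/current/app
          autostart=true
          autorestart=true

    container_commands:
      50_reload_supervisord:
        command: "supervisorctl -c /opt/python/etc/supervisord.conf reread && supervisorctl -c /opt/python/etc/supervisord.conf update"
        leader_only: true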

How to dynamically add / remove periodic tasks to Celery (celerybeat)

你。 submitted on 2019-11-26 03:51:56
Question: If I have a function defined as follows:

def add(x, y):
    return x + y

Is there a way to dynamically add this function as a celery PeriodicTask and kick it off at runtime? I'd like to be able to do something like (pseudocode):

some_unique_task_id = celery.beat.schedule_task(add, run_every=crontab(minute="*/30"))
celery.beat.start(some_unique_task_id)

I would also want to stop or remove that task dynamically with something like (pseudocode):

celery.beat.remove_task(some_unique_task_id)

or celery
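With the stock beat scheduler the schedule is read when beat starts, so truly dynamic add/remove at runtime usually means a database-backed scheduler such as django-celery-beat's DatabaseScheduler. For schedules known at configuration time, a sketch using the Celery 4+ API (the app and broker names are assumptions):

    from celery import Celery
    from celery.schedules import crontab

    app = Celery('demo', broker='redis://localhost:6379/0')

    @app.task
    def add(x, y):
        return x + y

    @app.on_after_configure.connect
    def setup_periodic_tasks(sender, **kwargs):
        # Each entry is registered under a name, so re-registering the same
        # name replaces it; removing it means dropping the entry and restarting beat.
        sender.add_periodic_task(crontab(minute='*/30'), add.s(2, 2),
                                 name='add every 30 minutes')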

Reporting yielded results of long-running Celery task

北城以北 submitted on 2019-11-26 02:58:19
Question: Problem: I've segmented a long-running task into logical subtasks, so I can report the results of each subtask as it completes. However, I'm trying to report the results of a task that will effectively never complete (instead yielding values as it goes), and am struggling to do so with my existing solution. Background: I'm building a web interface to some Python programs I've written. Users can submit jobs through web forms, then check back to see the job's progress. Let's say I have two
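A common substitute for "yielding" from a task is to push each intermediate result into the result backend with update_state, then poll it from the web side; a sketch (the PROGRESS state name and the meta layout are conventions I am assuming, not Celery built-ins):

    from celery import Celery

    app = Celery('jobs',
                 broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/1')

    @app.task(bind=True)
    def long_job(self, n):
        results = []
        for i in range(n):
            results.append(i * i)  # one logical subtask's result
            self.update_state(state='PROGRESS',
                              meta={'done': i + 1, 'results': results})
        return {'done': n, 'results': results}

    # Web side: AsyncResult(task_id).state is 'PROGRESS' while running, and
    # AsyncResult(task_id).info carries the latest meta dict.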