celery

Python+Celery: Chaining jobs?

Submitted by ε祈祈猫儿з on 2019-11-28 04:08:19
The Celery documentation suggests that it's a bad idea to have tasks wait on the results of other tasks… But the suggested solution (see the “good” heading) leaves something to be desired. Specifically, there's no clear way of getting the subtask's result back to the caller (also, it's kind of ugly). So, is there any way of “chaining” jobs, so the caller gets the result of the final job? E.g., to use the add example:

>>> add3 = add.subtask(args=(3, ))
>>> add.delay(1, 2, callback=add3).get()
6

Alternately, is it OK to return instances of Result? For example:

@task
def add(x, y, callback=None):
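For reference, a minimal sketch of how this is usually done on current Celery with the chain primitive (Celery 3.0+), where each task's return value is passed as the first argument of the next signature; the tasks module and a configured result backend are assumptions here:

from celery import chain
from tasks import add  # assumes the add task from the example lives in tasks.py

# (add.s(1, 2) | add.s(3)) is equivalent shorthand for the chain below.
result = chain(add.s(1, 2), add.s(3))()
print(result.get())  # -> 6, provided a result backend is configured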

passing django request object to celery task

Submitted by 浪尽此生 on 2019-11-28 04:06:03
Question: I have a task in tasks.py like so:

@app.task
def location(request):
    ....

I am trying to pass the request object directly from a view to the task like so:

def tag_location(request):
    tasks.location.delay(request)
    return JsonResponse({'response': 1})

I am getting an error that it can't be serialized, I guess? How do I fix this? The trouble is I have file upload objects as well… it's not all simple data types.

Answer 1: Because the request object contains references to things which aren't practical to
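A minimal sketch of the usual workaround, assuming the task only needs plain values: extract what you need from the request in the view, save any uploaded file to storage first, and pass only serializable data to the task (field names such as 'photo', 'lat' and 'lng' are purely illustrative):

from django.core.files.storage import default_storage
from django.http import JsonResponse
import tasks

def tag_location(request):
    upload = request.FILES.get('photo')
    # Persist the upload first and hand the task a path instead of a file object.
    path = default_storage.save('uploads/' + upload.name, upload) if upload else None
    payload = {
        'user_id': request.user.id,
        'lat': request.POST.get('lat'),
        'lng': request.POST.get('lng'),
        'file_path': path,
    }
    tasks.location.delay(payload)  # only JSON-serializable values cross the broker
    return JsonResponse({'response': 1})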

Celery Worker Database Connection Pooling

Submitted by 一世执手 on 2019-11-28 03:58:26
I am using Celery standalone (not within Django). I am planning to have one worker task type running on multiple physical machines. The task does the following: accept an XML document, transform it, and make multiple database reads and writes. I'm using PostgreSQL, but this would apply equally to other store types that use connections. In the past, I've used a database connection pool to avoid creating a new database connection on every request or keeping the connection open too long. However, since each Celery worker runs in a separate process, I'm not sure how they would actually be able to
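One common direction, sketched below under the assumption of psycopg2 and the default prefork pool: let each worker process build its own small connection pool after the fork, via the worker_process_init signal (the broker URL and DSN are illustrative):

from celery import Celery
from celery.signals import worker_process_init
from psycopg2.pool import ThreadedConnectionPool

app = Celery('tasks', broker='redis://localhost:6379/0')
db_pool = None

@worker_process_init.connect
def init_db_pool(**kwargs):
    # Runs in each forked worker process, so connections are never shared across processes.
    global db_pool
    db_pool = ThreadedConnectionPool(minconn=1, maxconn=4, dsn='dbname=mydb user=me')

@app.task
def process_document(xml_doc):
    conn = db_pool.getconn()
    try:
        with conn.cursor() as cur:
            cur.execute('SELECT 1')  # the transform's reads and writes go here
        conn.commit()
    finally:
        db_pool.putconn(conn)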

Callback for celery apply_async

Submitted by 删除回忆录丶 on 2019-11-28 03:52:20
I use Celery in my application to run periodic tasks. Let's look at a simple example below:

from myqueue import Queue

@periodic_task(run_every=timedelta(minutes=1))
def process_queue():
    queue = Queue()
    uid, questions = queue.pop()
    if uid is None:
        return
    job = group(do_stuff(q) for q in questions)
    job.apply_async()

def do_stuff(question):
    try:
        ...
    except:
        ...
        raise

As you can see in the example above, I use Celery to run an async task, but (since it's a queue) I need to do queue.fail(uid) in case of an exception in do_stuff, or queue.ack(uid) otherwise. In this situation it would be very clear and useful to
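A sketch of one way this is commonly wired up on Celery 4+: wrap the group in a chord so a success callback runs once after all members finish, and attach an error callback with on_error(); ack_queue and fail_queue are hypothetical helper tasks standing in for queue.ack/queue.fail:

from celery import chord, group, shared_task
from myqueue import Queue

@shared_task
def ack_queue(results, uid):
    # Chord body: runs once every do_stuff in the group has succeeded.
    Queue().ack(uid)

@shared_task
def fail_queue(failed_id, uid):
    # Error callback: called if something in the chord fails, with the failing id.
    Queue().fail(uid)

def process_queue():
    queue = Queue()
    uid, questions = queue.pop()
    if uid is None:
        return
    header = group(do_stuff.s(q) for q in questions)  # do_stuff must itself be a task
    chord(header)(ack_queue.s(uid).on_error(fail_queue.s(uid)))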

Celery error notes

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-11-28 03:30:26
A huge pitfall I recently ran into while using Celery in a Django project (venting): when using celery + redis for asynchronous tasks, some tasks executed successfully and some did not, and the failing tasks reported a NotRegistered error. Restarting the Redis service did not solve it. Solution: start a fresh Redis service (that was my fix); I suspect switching the broker to a different database would also work, e.g. 0 -> 2. Guess at the cause: this kind of problem usually shows up after the Celery task code or configuration has been modified (how the task is called, the arguments it takes, where the backend is stored, and so on). Redis may have persisted files that still remember the old tasks and configuration, so the broker gets confused when dispatching tasks to workers, producing errors such as NotRegistered or "got an unexpected argument". Source: https://www.cnblogs.com/peng-zhao/p/11389258.html
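A minimal sketch of the "switch to another broker database" workaround, assuming a Django settings module read by Celery with the CELERY_ namespace (the database numbers are illustrative):

# settings.py (sketch)
# Old broker that may still hold stale persisted messages from before the task changes:
# CELERY_BROKER_URL = 'redis://localhost:6379/0'

# Point the broker (and, if needed, the result backend) at a fresh Redis database:
CELERY_BROKER_URL = 'redis://localhost:6379/2'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/3'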

Retry Celery tasks with exponential back off

Submitted by 旧时模样 on 2019-11-28 03:18:45
For a task like this:

from celery.decorators import task

@task()
def add(x, y):
    if not x or not y:
        raise Exception("test error")
    return self.wait_until_server_responds(

if it throws an exception and I want to retry it from the daemon side, how can I apply an exponential back-off algorithm, i.e. after 2^2, 2^3, 2^4, etc. seconds? Also, is the retry maintained on the server side, such that if the worker happens to get killed then the next worker that spawns will take the retry task?

asksol: The task.request.retries attribute contains the number of tries so far, so you can use this to implement
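A minimal sketch of that approach with a bound task, where the retry delay grows as a power of two based on task.request.retries (max_retries and the starting exponent are illustrative):

from celery import shared_task

@shared_task(bind=True, max_retries=5)
def add(self, x, y):
    try:
        if not x or not y:
            raise Exception("test error")
        return x + y
    except Exception as exc:
        # Wait 2**2, 2**3, 2**4, ... seconds between successive attempts.
        raise self.retry(exc=exc, countdown=2 ** (self.request.retries + 2))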

Send log messages from all celery tasks to a single file

Submitted by 落爺英雄遲暮 on 2019-11-28 03:08:00
I'm wondering how to set up a more specific logging system. All my tasks use logger = logging.getLogger(__name__) as a module-wide logger. I want celery to log to "celeryd.log" and my tasks to "tasks.log", but I have no idea how to get this working. Using CELERYD_LOG_FILE from django-celery I can route all celeryd-related log messages to celeryd.log, but there is no trace of the log messages created in my tasks. Note: This answer is outdated as of Celery 3.0, where you now use get_task_logger() to get your per-task logger set up. Please see the Logging section of the What's new in Celery 3.0
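A sketch of the Celery 3.0+ direction the note points to: use get_task_logger() inside tasks and hook after_setup_task_logger to add a dedicated file handler, so task output lands in tasks.log while the worker's own log file stays wherever --logfile / CELERYD_LOG_FILE points (the format string is illustrative):

import logging
from celery import shared_task
from celery.signals import after_setup_task_logger
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@after_setup_task_logger.connect
def route_task_logs(logger, **kwargs):
    # Attach a file handler to the task logger set up by the worker.
    handler = logging.FileHandler('tasks.log')
    handler.setFormatter(logging.Formatter('%(asctime)s %(name)s %(levelname)s %(message)s'))
    logger.addHandler(handler)

@shared_task
def example():
    logger.info("this line ends up in tasks.log")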

How to use Flask-SQLAlchemy in a Celery task

Submitted by 為{幸葍}努か on 2019-11-28 02:49:14
I recently switched to Celery 3.0. Before that I was using Flask-Celery in order to integrate Celery with Flask. Although it had many issues, like hiding some powerful Celery functionality, it allowed me to use the full context of the Flask app and especially Flask-SQLAlchemy. In my background tasks I am processing data and using the SQLAlchemy ORM to store it. The maintainer of Flask-Celery has dropped support for the plugin. The plugin was pickling the Flask instance in the task so I could have full access to SQLAlchemy. I am trying to replicate this behavior in my tasks.py file but with no
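A sketch of the pattern that replaced the pickled-app trick: build the Celery app from a factory and subclass Task so every task body runs inside the Flask application context, which makes Flask-SQLAlchemy's db.session usable; the model and the config values below are illustrative:

from celery import Celery
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

def make_celery(flask_app):
    celery = Celery(flask_app.import_name,
                    broker=flask_app.config['CELERY_BROKER_URL'])
    celery.conf.update(flask_app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            # Push the Flask app context so db.session behaves as it does in a view.
            with flask_app.app_context():
                return super().__call__(*args, **kwargs)

    celery.Task = ContextTask
    return celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///example.db'
db.init_app(app)
celery = make_celery(app)

@celery.task
def store(value):
    db.session.add(SomeModel(value=value))  # SomeModel is a hypothetical model
    db.session.commit()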

Setting up a Celery service

Submitted by 此生再无相见时 on 2019-11-28 00:29:47
The whole project layout is as follows.

__init__.py

"""
Note: on Python 3.7 you need to run
pip install --upgrade https://github.com/celery/celery/tarball/master
otherwise you get
from . import async, base
SyntaxError: invalid syntax

Starting workers:
celery -A __init__ worker --concurrency=5 -l INFO -Q celery,save_redis
celery -A __init__ worker -l info -Q save_mongo
cd /Users/admin/PycharmProjects/function_test/adminSyS/mq && celery -A __init__ worker --concurrency=5

Starting celery with --autoscale=10,3 automatically adds 3 to 10 worker processes when the existing ones are not enough:
celery -A __init__ worker --concurrency=5 -l INFO -Q celery,save_redis2,save_redis --autoscale=10,3

supervisor configuration:
[program:celery]
directory = /data/app/adminSyS
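The excerpt stops before the application code itself; below is a minimal sketch of what the __init__.py app behind those commands might look like, with the queue names taken from the worker commands above and everything else (URLs, module names) purely illustrative:

# __init__.py (sketch)
from celery import Celery

app = Celery('mq',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1',
             include=['save_redis_tasks', 'save_mongo_tasks'])  # hypothetical task modules

# Route tasks onto the queues the worker commands above consume from.
app.conf.task_routes = {
    'save_redis_tasks.*': {'queue': 'save_redis'},
    'save_mongo_tasks.*': {'queue': 'save_mongo'},
}

if __name__ == '__main__':
    app.start()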

celery

Submitted by 时光毁灭记忆、已成空白 on 2019-11-27 22:12:48
Celery is a scheduling tool for asynchronous tasks. A detailed introduction will be added later; for now, straight to the code and how to use it. The directory structure is as follows:

celery_homedir/                  # main directory (1)
    celery_subdir/               # sub-directory (2)
        __init__.py              # init file inside the sub-directory (3), an empty file
        celery_subtasks1.py      # async task 1 (4)
        celery_subtasks2.py      # async task 2
        celery_subtasks3.py      # async task 3
        ......
        celery_subtasksn.py      # async task n
    __init__.py                  # init file in the celery main directory (5)
    celeryconfig.py              # celery configuration file (6)

The celery_homedir folder contains all the files for the whole Celery setup.
The celery_subdir folder is the second-level directory and contains all the asynchronous tasks.
celery_subtasks1.py contains the full code for a single subtask.
#
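The excerpt cuts off right where celery_subtasks1.py begins; a minimal sketch of what a single subtask module could look like under this layout, loading its settings from celeryconfig.py (the task body and names are illustrative):

# celery_subtasks1.py (sketch)
from celery import Celery

app = Celery('celery_subtasks1')
app.config_from_object('celeryconfig')  # celeryconfig.py must be importable (on PYTHONPATH)

@app.task
def async_task_1(x, y):
    # Placeholder body for async task 1.
    return x + y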