celery-task

How do I override the backend for celery tasks

Submitted by ☆樱花仙子☆ on 2020-02-03 17:46:07
Question: We're using Redis as our result backend. However, for one task we'd like to override this and use RabbitMQ instead. The documentation for Task.backend says: "The result store backend to use for this task. Defaults to the CELERY_RESULT_BACKEND setting." So I had assumed that we could set Task.backend to a string in the same format accepted by CELERY_RESULT_BACKEND, and I try this: celeryconfig.py: CELERY_RESULT_BACKEND = "redis://redis-host:7777" tasks.py: @app.task(backend='amqp://guest@localhost
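The excerpt cuts off before the outcome, but the backend argument to the task decorator is generally reported to want a backend instance rather than a URL string. A minimal sketch of that approach, assuming a Celery version that still ships celery.backends.amqp (the AMQP result backend was removed in Celery 5):

from celery import Celery
from celery.backends.amqp import AMQPBackend

app = Celery('tasks', broker='amqp://guest@localhost//')
app.conf.result_backend = 'redis://redis-host:7777'  # the global default (CELERY_RESULT_BACKEND)

# Override the result backend for this one task by passing a backend instance.
@app.task(backend=AMQPBackend(app, url='amqp://guest@localhost//'))
def special_task(x, y):
    return x + y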

In celery, what is the appropriate way to pass contextual metadata from sender process to worker when a task is enqueued?

Submitted by 六月ゝ 毕业季﹏ on 2020-01-24 09:40:14
Question: When any Celery task is enqueued, I want to add contextual metadata that the worker will be able to use. The following code example works, but I would like a proper Celery-style solution. from celery.signals import before_task_publish, task_prerun @before_task_publish.connect def receiver_before_task_publish(sender=None, headers=None, body=None, **kwargs): task_kwags = body[1] metadata = {"foo": "bar"} task_kwags['__metadata__'] = metadata @task_prerun.connect def receiver_task_pre_run
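The excerpt is cut off before the prerun receiver; below is a minimal sketch completing the same kwargs-injection pattern (the '__metadata__' key is the question's own convention, and popping it keeps the task from receiving an unexpected keyword argument). Putting the metadata into the headers dict instead of the task kwargs is another option that leaves the task's arguments untouched, though how the worker reads it back depends on the task protocol version.

from celery.signals import before_task_publish, task_prerun

@before_task_publish.connect
def receiver_before_task_publish(sender=None, headers=None, body=None, **kwargs):
    # With task protocol 2, body is (args, kwargs, embed); index 1 is the kwargs dict.
    task_kwargs = body[1]
    task_kwargs['__metadata__'] = {"foo": "bar"}

@task_prerun.connect
def receiver_task_prerun(sender=None, task_id=None, task=None, args=None,
                         kwargs=None, **extra):
    # kwargs here is the same dict the publisher modified above.
    metadata = kwargs.pop('__metadata__', {})
    # ... use metadata (logging context, tenant id, trace id, ...)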

Celery: list all tasks, scheduled, active *and* finished

Submitted by 匆匆过客 on 2020-01-22 05:40:26
Question: Update for the bounty: I'd like a solution that does not involve a monitoring thread, if possible. I know I can view scheduled and active tasks using the Inspect class of my app's Control: i = myapp.control.inspect() currently_running = i.active() scheduled = i.scheduled() But I could not find any function to show already finished tasks. I know that this information must be at least temporarily accessible, because I can look up a finished task by its task_id: >>> r = mytask.AsyncResult(task
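A short sketch of what the excerpt already hints at: Inspect covers pending and active work on live workers, while finished tasks can only be queried per id through the result backend, so the ids have to be recorded at submission time (there is no general "list all finished tasks" call).

i = myapp.control.inspect()      # myapp is the Celery app from the question
currently_running = i.active()
scheduled = i.scheduled()
reserved = i.reserved()          # received by a worker but not yet started

# Finished tasks: the backend only answers per-id queries, so keep the ids yourself.
known_ids = []                   # placeholder: ids you stored when calling apply_async()
for task_id in known_ids:
    res = myapp.AsyncResult(task_id)
    if res.ready():
        print(task_id, res.state, res.result)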

Is there a way to receive a notification as soon as a certain task with a certain task id is successful or fails using Celery for Python?

Submitted by 点点圈 on 2020-01-15 20:00:55
Question: I want to know if there is a way to monitor whether a task completes or fails, as soon as it does, using Python Celery. I have an event I want to fire based on the result of a certain task. Answer 1: You can run your task as a Celery @shared_task with a try/except block inside: @shared_task def my_task(input1, input2, ...): Setting up... try: Do stuff fire_success_event() <- your success event except Exception: The above stuff failed fire_fail_event() <- your fail event return 1 <- fail
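A runnable version of the answer's pseudocode; fire_success_event and fire_fail_event are placeholders for whatever notification mechanism you use, not Celery APIs. Celery also exposes task_success and task_failure signals, which can be connected for a specific task if you prefer not to wrap the task body.

from celery import shared_task

def fire_success_event(result):
    print("task succeeded:", result)   # placeholder notification

def fire_fail_event(exc):
    print("task failed:", exc)         # placeholder notification

@shared_task
def my_task(input1, input2):
    try:
        result = input1 + input2       # the "Do stuff" part
        fire_success_event(result)
        return result
    except Exception as exc:
        fire_fail_event(exc)
        return 1                       # the answer's failure sentinel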

How to make two tasks mutually exclusive in Celery?

Submitted by ♀尐吖头ヾ on 2020-01-03 05:07:12
Question: Is there a way to disallow two different tasks from running simultaneously in Celery? I was thinking about defining a new queue with concurrency level 1 and sending those tasks to that queue, but I couldn't find an example. Is that possible? Thanks! Answer 1: Yes, if you don't need to worry about overall throughput, it is possible to create a separate queue and have a dedicated worker with concurrency set to 1. You can create as many queues as you want and configure which of those queues each worker
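A minimal sketch of the dedicated-queue approach the answer describes; the queue and task names here are illustrative, not from the question.

from celery import Celery
from kombu import Queue

app = Celery('tasks', broker='amqp://guest@localhost//')
app.conf.task_queues = (Queue('default'), Queue('exclusive'))
app.conf.task_routes = {
    'tasks.task_a': {'queue': 'exclusive'},   # both mutually-exclusive tasks
    'tasks.task_b': {'queue': 'exclusive'},   # share the single-consumer queue
}

@app.task
def task_a():
    ...

@app.task
def task_b():
    ...

# Start one single-process worker for that queue so task_a and task_b can never
# overlap, and a normal worker for everything else:
#   celery -A tasks worker -Q exclusive --concurrency=1
#   celery -A tasks worker -Q default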

Retrieve result from 'task_id' in Celery from unknown task

Submitted by 我与影子孤独终老i on 2020-01-01 09:08:39
Question: How do I pull the result of a task if I do not know beforehand which task was performed? Here's the setup. Given the following source ('tasks.py'): from celery import Celery app = Celery('tasks', backend="db+mysql://u:p@localhost/db", broker='amqp://guest:guest@localhost:5672//') @app.task def add(x, y): return x + y @app.task def mul(x, y): return x * y with RabbitMQ 3.3.2 running locally: marcs-mbp:sbin marcstreeter$ ./rabbitmq-server RabbitMQ 3.3.2. Copyright (C) 2007-2014 GoPivotal, Inc. #
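The excerpt cuts off before the answer; below is a small sketch of the usual approach, which works because AsyncResult only needs the task id and the app's result backend, not the function that produced the result.

from celery.result import AsyncResult
from tasks import app                  # the Celery app from tasks.py above

task_id = "..."                        # an id returned earlier by add.delay() or mul.delay()
res = AsyncResult(task_id, app=app)    # equivalently: app.AsyncResult(task_id)
if res.ready():
    print(res.state, res.result)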

Celery AttributeError: async error

Submitted by 一曲冷凌霜 on 2019-12-31 08:54:24
Question: I have RabbitMQ and Celery running locally on my Mac (OS X 10.13.4), and the following code works locally when I run add.delay(x, y): #!/usr/bin/env python from celery import Celery from celery.utils.log import get_task_logger logger = get_task_logger(__name__) app = Celery('tasks', \ broker='pyamqp://appuser:xx@c2/appvhost', \ backend='db+mysql://appuser:xx@c2/pigpen') @app.task(bind=True) def dump_context(self, x, y): print('Executing task id {0.id}, args: {0.args!r} kwargs {0.kwargs!r}'.format
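The excerpt is cut off before the traceback, so the cause here is an assumption: this AttributeError is most often reported when running Celery 4.0/4.1 (or an old kombu) on Python 3.7+, where async became a reserved keyword and the affected modules were later renamed to asynchronous. A quick check of what is installed:

import sys
import celery
import kombu

# If Python is 3.7+ and celery/kombu predate the async -> asynchronous rename,
# upgrading both packages is the commonly reported fix.
print(sys.version)
print("celery", celery.__version__, "kombu", kombu.__version__)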

Cannot start Celery Worker (Kombu.asynchronous.timer)

Submitted by 与世无争的帅哥 on 2019-12-22 05:38:28
Question: I followed the first steps with Celery (Django) and am trying to run a heavy process in the background. I have the RabbitMQ server installed. However, when I run celery -A my_app worker -l info, it throws the following error: File "<frozen importlib._bootstrap>", line 994, in _gcd_import File "<frozen importlib._bootstrap>", line 971, in _find_and_load File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked File "<frozen importlib._bootstrap>", line 665, in _load_unlocked File "
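The traceback is truncated at the frozen importlib frames, so this is only a guess at a diagnostic step: importing the project's Celery module directly usually surfaces the real error with a full traceback (often a Celery/kombu version mismatch around the kombu.async to kombu.asynchronous rename). The module path below is an assumption based on the conventional Django layout for the my_app name used in the worker command.

# Run from the Django project root; any import-time failure will print an
# untruncated traceback instead of the worker's frozen importlib frames.
from my_app.celery import app as celery_app   # hypothetical path: my_app/celery.py

print(celery_app)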

Retrieving GroupResult from taskset_id in Celery?

Submitted by ぃ、小莉子 on 2019-12-18 15:47:50
Question: I am starting a set of Celery tasks by using a celery group as described in the official documentation. I am also storing the group (taskset) id in a DB, in order to poll Celery for the taskset state. job = group([ single_test.s(1, 1), single_test.s(1, 2), single_test.s(1, 3), ]) result = job.apply_async() test_set = MyTestSet() test_set.taskset_id = result.id # store test_set into DB Is there a way to obtain a GroupResult object (i.e. my result) starting from the taskset id? Something like
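The excerpt is cut off, but the usual answer is GroupResult.restore(), which only works if the group's metadata was saved to the result backend first. A hedged sketch (save() and restore() need a backend that supports them, such as Redis or a database backend; "myapp" stands in for your Celery app):

from celery.result import GroupResult

result = job.apply_async()       # job is the group() from the question
result.save()                    # persist the group metadata in the result backend
test_set = MyTestSet()
test_set.taskset_id = result.id  # store test_set into the DB as before

# ... later, possibly in another process:
restored = GroupResult.restore(test_set.taskset_id, app=myapp)
print(restored.completed_count(), restored.ready())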