django-celery

Django Celery Logging Best Practice

Submitted by 半腔热情 on 2019-11-27 04:12:54
Question: I'm trying to get Celery logging working with Django. I have logging set up in settings.py to go to the console (that works fine, as I'm hosting on Heroku). At the top of each module, I have:

    import logging
    logger = logging.getLogger(__name__)

And in my tasks.py, I have:

    from celery.utils.log import get_task_logger
    logger = get_task_logger(__name__)

That works fine for logging calls from a task, and I get output like this: 2012-11-13T18:05:38+00:00 app[worker.1]: [2012-11-13 18:05:38,527: INFO
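A minimal console-only LOGGING configuration of the kind the question describes might look like the sketch below (the handler and formatter names are illustrative, not taken from the question's settings.py). In tasks.py, `get_task_logger(__name__)` returns a logger inside Celery's task logging hierarchy, so task records flow through the worker's handlers:

```python
import logging
import logging.config

# Console-only logging, as is typical on Heroku.
# Handler/formatter names here are illustrative.
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(levelname)s %(name)s %(message)s"},
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "stream": "ext://sys.stdout",
            "formatter": "simple",
        },
    },
    "root": {"handlers": ["console"], "level": "INFO"},
}

logging.config.dictConfig(LOGGING)
logger = logging.getLogger(__name__)
logger.info("module-level logging goes to the console")

# In tasks.py you would instead use (requires Celery):
#   from celery.utils.log import get_task_logger
#   logger = get_task_logger(__name__)
```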

Add n tasks to celery queue and wait for the results

Submitted by 社会主义新天地 on 2019-11-27 01:32:51
Question: I would like to add multiple tasks to the celery queue and wait for the results. I have various ideas for how I could achieve this using some form of shared storage (memcached, redis, a db, etc.); however, I would have thought it's something Celery can handle automatically, but I can't find any resources online. Code example:

    def do_tasks(b):
        for a in b:
            c.delay(a)
        return c.all_results_some_how()

Answer 1: For Celery >= 3.0, TaskSet is deprecated in favour of group.

    from celery import group
    from tasks import
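The fan-out-then-collect pattern the answer points at can be sketched without a broker. The stdlib analogue below stands in for Celery's `group`; the Celery calls in the comment are the real API, while `c` and `do_tasks` are the question's illustrative names:

```python
from concurrent.futures import ThreadPoolExecutor

# Stdlib analogue of fanning out n tasks and blocking on the results.
# With Celery installed this would be:
#   from celery import group
#   result = group(c.s(a) for a in b)()
#   values = result.get()
def c(a):          # stand-in for the Celery task
    return a * 2

def do_tasks(b):
    with ThreadPoolExecutor() as pool:
        # map preserves input order, like group's result list
        return list(pool.map(c, b))

print(do_tasks([1, 2, 3]))  # prints [2, 4, 6]
```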

Retry Lost or Failed Tasks (Celery, Django and RabbitMQ)

Submitted by 纵然是瞬间 on 2019-11-27 00:47:08
Question: Is there a way to determine whether a task has been lost, and to retry it? I think tasks can be lost because of a dispatcher bug or a worker-thread crash. I was planning to retry them, but I'm not sure how to determine which tasks need to be retried, or how to make this process automatic. Can I use my own custom scheduler, which would create new tasks? Edit: I found in the documentation that RabbitMQ never loses tasks, but what happens when a worker thread crashes in the middle of task execution? Answer 1: What
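Celery's standard guard against exactly this case is late acknowledgement (`acks_late=True`): the broker only discards a message after the task finishes, so RabbitMQ redelivers it if the worker dies mid-task. The redeliver-until-acked behaviour can be sketched with a plain in-process queue (everything below is an illustration, not Celery's implementation):

```python
# With Celery the real pattern is:
#   @app.task(acks_late=True)
#   def process(doc_id):
#       ...
# The message is acked only after the task body returns.
import queue

broker = queue.Queue()
broker.put("task-1")

def run_worker(crash):
    msg = broker.get()
    try:
        if crash:
            raise RuntimeError("worker crashed mid-task")
        return msg  # success: message would be acked here
    except RuntimeError:
        broker.put(msg)  # never acked, so the broker redelivers it
        return None

assert run_worker(crash=True) is None       # first attempt dies
assert run_worker(crash=False) == "task-1"  # redelivered and completed
```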

Retry Celery tasks with exponential back off

Submitted by 為{幸葍}努か on 2019-11-26 23:59:56
Question: For a task like this:

    from celery.decorators import task

    @task()
    def add(x, y):
        if not x or not y:
            raise Exception("test error")
        return self.wait_until_server_responds(

if it throws an exception and I want to retry it from the daemon side, how can I apply an exponential back-off algorithm, i.e. retry after 2^2, 2^3, 2^4, etc. seconds? Also, is the retry maintained on the server side, so that if the worker happens to get killed, the next worker that spawns will pick up the retry task? Answer 1: The task
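In recent Celery the idiomatic form is a bound task that calls `self.retry(exc=exc, countdown=2 ** self.request.retries)`; the retry is re-published to the broker with an ETA, so it survives the worker that scheduled it. The back-off schedule itself is just arithmetic (the function name is illustrative):

```python
# Exponential back-off schedule: 2^2, 2^3, 2^4, ... seconds.
# With Celery, a bound task would raise
#   self.retry(exc=exc, countdown=2 ** self.request.retries)
# on each failure; this helper only computes the resulting delays.
def backoff_delays(max_retries, base=2, start_exponent=2):
    return [base ** (start_exponent + n) for n in range(max_retries)]

print(backoff_delays(4))  # prints [4, 8, 16, 32]
```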

How to check task status in Celery?

Submitted by 社会主义新天地 on 2019-11-26 19:40:03
How does one check whether a task is running in Celery (specifically, I'm using celery-django)? I've read the documentation, and I've googled, but I can't see a call like:

    my_example_task.state() == RUNNING

My use case is that I have an external (Java) service for transcoding. When I send a document to be transcoded, I want to check whether the task that runs that service is running, and if not, to (re)start it. I'm using the current stable version (2.4, I believe). Answer: Return the task_id (which is returned by .delay()) and ask the Celery instance afterwards about the state:

    x = method.delay(1,2)
    print
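There is indeed no RUNNING state; Celery reports state strings through AsyncResult, and STARTED is only emitted when the worker has started-state tracking enabled. A small restart check in the spirit of the question's use case (function name is illustrative) could look like:

```python
# Celery exposes task state via AsyncResult:
#   result = my_example_task.delay(doc)
#   result.state  # 'PENDING', 'STARTED', 'RETRY', 'SUCCESS', 'FAILURE'
# Note: 'STARTED' is only reported if the worker tracks started tasks
# (task_track_started); there is no 'RUNNING' state.
TERMINAL_STATES = {"SUCCESS", "FAILURE", "REVOKED"}

def needs_restart(state):
    """(Re)start the transcoding task only once the previous run is over."""
    return state in TERMINAL_STATES

assert needs_restart("FAILURE")
assert not needs_restart("STARTED")
```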

using class methods as celery tasks

Submitted by 大兔子大兔子 on 2019-11-26 19:34:34
Question: I'm trying to use the methods of a class as django-celery tasks, marking them with the @task decorator. The same situation is described here, as asked by Anand Jeyahar. It's something like this:

    class A:
        @task
        def foo(self, bar):
            ...

    def main():
        a = A()
        ...
        # what I need
        a.foo.delay(bar)  # executes as celery task
        a.foo(bar)        # executes locally

The problem is that even if I use a class instance, like a.foo.delay(bar), it says that foo needs at least two arguments, which means the self pointer is missing.
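The decorator wraps the unbound function, so nothing supplies `self` when the worker later calls it. A common workaround is a module-level task that reconstructs the instance and delegates to the method. The sketch below (class fields and names are illustrative) shows the shape without requiring a worker:

```python
# The @task decorator captures the plain function, so `self` is never
# bound when the task executes, hence "foo needs at least two arguments".
# Workaround: a module-level task rebuilds the instance from serialisable
# arguments and calls the method.
class A:
    def __init__(self, prefix="a"):
        self.prefix = prefix

    def foo(self, bar):
        return f"{self.prefix}:{bar}"

# @task  # with django-celery this function would be the decorated task
def foo_task(prefix, bar):
    return A(prefix).foo(bar)

assert foo_task("a", "x") == "a:x"
```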
