celery

RabbitMQ queues filling up with Celery tasks

社会主义新天地 submitted on 2020-01-15 18:48:30
Question: I am using Celery to call multiple hardware units by their IP addresses. Each unit returns a list of values. Application code below:

    # create a list of tasks
    modbus_calls = []
    for site in sites:
        call = call_plc.apply_async((site.name, site.address), expires=120)  # expires after 2 minutes?
        modbus_calls.append(call)

    # below checks all tasks are complete (values returned), then move forward out of the while loop
    ready_list = [False]
    while not all(ready_list):
        ready_list = []
        for task in modbus
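
The code above is cut off mid-loop. As an alternative sketch, not the asker's code, the same dispatch-and-wait pattern can be expressed with a Celery group (call_plc, sites, and the 120-second expiry come from the question; everything else is assumed):

    from celery import group

    # one signature per hardware unit, each expiring after 2 minutes
    job = group(
        call_plc.s(site.name, site.address).set(expires=120)
        for site in sites
    )
    result = job.apply_async()

    # block until every unit has answered (or raise after the timeout)
    values = result.get(timeout=120)  # list of per-site value lists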

How to recursively chain a Celery task that returns a list into a group?

白昼怎懂夜的黑 submitted on 2020-01-15 09:29:50
Question: I started from this question: How to chain a Celery task that returns a list into a group? But I want to expand twice. So in my use case I have:

    task A: determines the total number of items for a given date
    task B: downloads 1000 metadata entries for that date
    task C: downloads the content for one item

So each step expands the number of items for the next step. I can do it by looping through the results in my task and calling .delay() on the next task function. But I thought I'd try to not
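
The question is cut off here, but a minimal sketch of the looping approach it describes, with each task fanning out to the next, might look like this (get_item_count, get_metadata_page, and download_item are assumed helper names; the 1000-entry page size comes from the question):

    from celery import shared_task

    @shared_task
    def task_a(date):
        total = get_item_count(date)                # assumed helper
        for offset in range(0, total, 1000):
            task_b.delay(date, offset)              # fan out: one page per call

    @shared_task
    def task_b(date, offset):
        entries = get_metadata_page(date, offset)   # assumed helper, ~1000 entries
        for entry in entries:
            task_c.delay(entry["id"])               # fan out: one item per entry

    @shared_task
    def task_c(item_id):
        download_item(item_id)                      # assumed helper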

Celery is not registering tasks

醉酒当歌 submitted on 2020-01-15 08:52:05
Question: Hello! I just started to use Celery with Django. I have a task that needs to be periodic. In the admin interface I can see my task in the dropdown list named "Task (registered):", but when Celery Beat tries to execute it a NotRegistered exception is thrown. Python 3.5.2, Django 1.11.4, Celery 4.1, django-celery-beat 1.1.0, django-celery-results 1.0.1. The part of settings.py related to Celery:

    CELERY_BROKER_URL = 'amqp://user:*****@192.168.X.X/proj'
    CELERY_ACCEPT_CONTENT = ['json']
    CELERY_RESULT_BACKEND =
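
For context, not taken from the question, a minimal sketch of the usual celery.py application module for a Django project with the CELERY_-prefixed settings shown above (the package name proj is an assumption):

    # proj/celery.py
    import os

    from celery import Celery

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

    app = Celery('proj')
    app.config_from_object('django.conf:settings', namespace='CELERY')
    app.autodiscover_tasks()  # look for tasks.py in every installed app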

django start celery daemon in production with supervisor

北战南征 submitted on 2020-01-15 06:39:26
Question: I have created a conf file for supervisor at /etc/supervisor/conf.d/myproject-celery.conf. My configuration file looks like:

    [program:celery]
    command=/var/www/html/project/venv/bin/python /var/www/html/project/manage.py celeryd --loglevel=INFO
    environment=PYTHONPATH=/var/www/html/project
    directory=/var/www/html/project
    user=www-data
    numprocs=1
    stdout_logfile=/var/log/celeryd.log
    stderr_logfile=/var/log/celeryd.log
    autostart=true
    autorestart=true
    startsecs=10
    stopwaitsecs=600
    priority=998
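
One point of comparison, not part of the question: manage.py celeryd is the entry point of the old django-celery package; with Celery 3.1+ the equivalent supervisor command line usually invokes the celery program directly (the project module name proj is an assumption):

    command=/var/www/html/project/venv/bin/celery -A proj worker --loglevel=INFO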

Getting parameters of a failed django-celery task

戏子无情 submitted on 2020-01-14 14:36:29
Question: Is it possible to get the arguments used to call a particular failed Celery task given the task's ID? I am using MongoDB as the broker and the django-celery package. I know that you can get the result pretty easily, but I wanted to know if you can do the same with the arguments used to call that task. Thanks.

Answer 1: I managed to solve this problem by implementing a custom on_failure handler for my task, as specified here: http://docs.celeryproject.org/en/latest/userguide/tasks.html#handlers I
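
The answer is cut off above, but the handler hook it refers to is documented with the signature on_failure(self, exc, task_id, args, kwargs, einfo). A minimal sketch of a base class that logs the failing task's arguments (the class name and the use of the standard logging module are assumptions, not the answerer's code):

    import logging

    from celery import Task

    logger = logging.getLogger(__name__)

    class ArgsLoggingTask(Task):
        abstract = True  # acts only as a base class, not a registered task

        def on_failure(self, exc, task_id, args, kwargs, einfo):
            # args/kwargs hold the positional and keyword arguments the
            # failed task was originally called with
            logger.error("Task %s failed with args=%r kwargs=%r: %r",
                         task_id, args, kwargs, exc)

Tasks would then opt in with @task(base=ArgsLoggingTask).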

Django: djcelery import error 'from celery import current_app as celery' in virtualenv

青春壹個敷衍的年華 submitted on 2020-01-14 14:12:05
Question: Okay, so I have tried everything I and Google can come up with. I'm trying to run django-celery under a virtualenv on my MacBook Pro, OS X 10.8.4. I installed django-celery using pip while the virtualenv was activated. I get the following when importing djcelery in the virtualenv's python:

    (platform)Chriss-MacBook-Pro:platform Chris$ python
    Python 2.7.2 (default, Oct 11 2012, 20:14:37)
    [GCC 4.2.1 Compatible Apple Clang 4.0 (tags/Apple/clang-418.0.60)] on darwin
    Type "help", "copyright", "credits" or

Celery task with multiple decorators not auto registering task name

给你一囗甜甜゛ submitted on 2020-01-14 07:29:12
Question: I have a task that looks like this:

    from mybasetask_module import MyBaseTask

    @task(base=MyBaseTask)
    @my_custom_decorator
    def my_task(*args, **kwargs):
        pass

and my base task looks like this:

    from celery import task, Task

    class MyBaseTask(Task):
        abstract = True
        default_retry_delay = 10
        max_retries = 3
        acks_late = True

The problem I'm running into is that the Celery worker registers the task with the name 'mybasetask_module.__inner'. The task is registered fine (which is the package+module
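
The question is truncated above, but the reported name 'mybasetask_module.__inner' typically means the custom decorator returns an inner wrapper without copying the wrapped function's metadata, so Celery derives the task name from the wrapper. A minimal sketch of a metadata-preserving decorator (the body of my_custom_decorator is assumed):

    import functools

    def my_custom_decorator(func):
        @functools.wraps(func)            # keeps func.__name__ and __module__,
        def __inner(*args, **kwargs):     # so the task registers as ...my_task
            # custom behaviour would go here
            return func(*args, **kwargs)
        return __inner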

Task state and django-celery

假如想象 submitted on 2020-01-14 07:16:05
Question: I use django-celery and have a task like this:

    class TestTask(Task):
        name = "enabler.test_task"

        def run(self, **kw):
            debug_log("begin test task")
            time.sleep(5)
            debug_log("end test task")

        def on_success(self, retval, task_id, args, kwargs):
            debug_log("on success")

        def on_failure(self, retval, task_id, args, kwargs):
            debug_log("on failure")

I use the django shell to run the task:

    python manage.py shell
    r = tasks.TestTask().delay()

From the celery log I see that the task is executed:

    [2012-01-16 08:13:29,362:
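
The log excerpt is truncated above. Since the question is about task state, here is a minimal sketch, not from the original post, of how the returned AsyncResult can be inspected from the same shell (it assumes a result backend is configured for django-celery):

    r = tasks.TestTask().delay()
    print(r.task_id)            # UUID of the dispatched task
    print(r.state)              # stays PENDING until the worker finishes unless
                                # CELERY_TRACK_STARTED is enabled, then STARTED
    print(r.ready())            # True once the task has completed
    print(r.get(timeout=10))    # block for (and return) the task's result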

Cloning a celery chain

送分小仙女□ submitted on 2020-01-14 04:46:12
Question: I have an interesting issue attempting to clone a celery chain for use in a group. My intended use case is something like group([chain.clone(args=args) for args in it]), however it keeps complaining about not having enough arguments. I have broken this down using the below in a file named tasks.py:

    @app.task
    def add(x, y):
        return x + y

and then from the python shell:

    >>> from tasks import add
    >>> chain = add.s() | add.s(1)
    >>> chain
    magic_carpet.celery.add() | add(1)
    >>> chain.args
    ()
    >>> chain.delay(2
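
The shell transcript is cut off above. As a comparison sketch, not the clone-based approach the question is pursuing, the same two-step pipeline can be fanned out by building a fully bound chain per argument tuple inside the group (add and the trailing add.s(1) come from the question; the iterable it is assumed to yield (x, y) pairs):

    from celery import group
    from tasks import add

    it = [(1, 2), (3, 4), (5, 6)]                   # assumed example input

    # build a fresh chain per argument tuple instead of cloning a partial one
    job = group(add.s(x, y) | add.s(1) for x, y in it)
    result = job.apply_async()
    print(result.get())                             # e.g. [4, 8, 12]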