django-celery

How to use Supervisor + Django + Celery with multiple Queues and Workers?

时光怂恿深爱的人放手 submitted on 2019-12-03 10:30:34
Question: I'm using Celery + Django + Supervisord and I'm trying to set up a "priority" scheme by creating 3 different queues (as suggested at https://stackoverflow.com/a/15827160/54872). Is there a way to start celery beat and the workers for each queue in one command for supervisor? Or do I need to make a separate supervisor conf file for each queue/worker pool and one for celery beat?

Answer 1: You can create program sections for each queue and combine them in a group section: [program:worker1] command=celery
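A minimal sketch of what such a layout could look like in supervisord syntax; the project name proj, the queue names, and the concurrency values are assumptions rather than details from the original answer:

    [program:celerybeat]
    command=celery -A proj beat --loglevel=INFO

    [program:worker_high]
    command=celery -A proj worker -Q high --concurrency=2 --loglevel=INFO

    [program:worker_default]
    command=celery -A proj worker -Q default,low --concurrency=4 --loglevel=INFO

    [group:celery]
    programs=celerybeat,worker_high,worker_default

With the group section in place, supervisorctl can manage everything at once, e.g. supervisorctl restart celery:*.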

Celery auto reload on ANY changes

烈酒焚心 submitted on 2019-12-03 10:04:55
I could make celery reload itself automatically when there are changes to the modules listed in CELERY_IMPORTS in settings.py. I tried giving it parent modules so it would detect changes even in child modules, but it did not detect changes in child modules. That makes me think the detection is not done recursively by celery. I searched the documentation but did not find any answer to my problem. It is really bothering me to add everything related to the celery part of my project to CELERY_IMPORTS just to detect changes. Is there a way to tell celery to "auto reload yourself when there is any change in
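One workaround sketch (not taken from the original question): populate CELERY_IMPORTS dynamically so every submodule of a package is listed, and therefore watched, without naming each one by hand. The package name myapp is an assumption:

    # settings.py
    import pkgutil

    import myapp

    # Recursively walk the package and list every submodule, so changes in
    # child modules are covered by CELERY_IMPORTS as well as the parents.
    CELERY_IMPORTS = tuple(
        name
        for _, name, _ in pkgutil.walk_packages(myapp.__path__, prefix="myapp.")
    )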

Create celery tasks then run synchronously

戏子无情 submitted on 2019-12-03 09:33:11
Question: My app gathers a bunch of phone numbers on a page. Once the user hits the submit button, I create a celery task to call each number and give a reminder message, then redirect them to a page where they can see live updates about the calls. I am using web sockets to live-update the status of each call and need the tasks to execute synchronously, as I only have access to dial out from one number. So once the first call/task is completed, I want the next one to fire off. I took a look at CELERY
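A hedged sketch of one way to get that one-after-another behaviour with a chain of task signatures; the task name call_number and the module path are placeholders, not code from the question:

    from celery import chain

    from myapp.tasks import call_number  # placeholder task that dials one number

    def start_calls(phone_numbers):
        # si() creates immutable signatures so no result is passed between
        # tasks; the chain then runs the calls strictly one after another.
        workflow = chain(*[call_number.si(number) for number in phone_numbers])
        return workflow.apply_async()

Another common approach is to route these tasks to a dedicated queue served by a single worker process (--concurrency=1), which also guarantees that only one call runs at a time.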

Celery: Worker with concurrency and reserved tasks only running 1 task

两盒软妹~` submitted on 2019-12-03 09:00:38
Question: Some of the tasks in my code were taking longer and longer to execute. Upon inspection I noticed that although I have my worker node set to concurrency 6, and 6 processes exist to 'do work', only 1 task is shown under 'running tasks'. (The original question includes screenshots of the worker options and of the task tab showing only 1 running process.) I have found that if I restart celery, the concurrency is once again respected and I will see >1 running task, but after some
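A hedged way to check this from the command line with celery's standard inspect subcommands (the app name proj is an assumption):

    # Tasks each worker process is executing right now
    celery -A proj inspect active

    # Tasks the worker has prefetched (reserved) but not yet started
    celery -A proj inspect reserved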

Python Celery versus Threading Library for running async requests [closed]

醉酒当歌 submitted on 2019-12-03 06:10:32
Closed. This question is opinion-based and is not currently accepting answers. I am running a python method that parses a lot of data. Since it is time-intensive, I would like to run it asynchronously on a separate thread so the user can still access the website/UI. Do threads created with "from threading import Thread" terminate if a user exits the site, or do they continue to run on the server? What would be the advantages of using Celery versus simply
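A minimal sketch contrasting the two approaches being compared; parse_data and its Celery counterpart are placeholders, not code from the question:

    from threading import Thread

    def parse_data(payload):
        # Placeholder for the time-intensive parsing work.
        ...

    # Plain thread: runs inside the web server process. It does not stop just
    # because the user leaves the site, but it dies if that process restarts.
    Thread(target=parse_data, args=({"example": "payload"},)).start()

    # Celery equivalent (assuming parse_data is registered as a task): the work
    # runs in a separate worker process, survives web process restarts via the
    # broker queue, and can be retried and monitored.
    # parse_data_task.delay(payload)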

Recover from task failed beyond max_retries

江枫思渺然 submitted on 2019-12-03 05:46:30
Question: I am attempting to asynchronously consume a web service because it takes up to 45 seconds to return. Unfortunately, this web service is also somewhat unreliable and can throw errors. I have set up django-celery and have my tasks executing, which works fine until the task fails beyond max_retries. Here is what I have so far:

    @task(default_retry_delay=5, max_retries=10)
    def request(xml):
        try:
            server = Client('https://www.whatever.net/RealTimeService.asmx?wsdl')
            xml = server.service
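A hedged sketch of one way to recover once retries are exhausted, building on the snippet above; the suds Client import, the service method name, and the log_failed_request helper are assumptions:

    from celery import task
    from celery.exceptions import MaxRetriesExceededError
    from suds.client import Client

    @task(default_retry_delay=5, max_retries=10)
    def request(xml):
        try:
            server = Client('https://www.whatever.net/RealTimeService.asmx?wsdl')
            return server.service.Run(xml)  # hypothetical service method
        except Exception as exc:
            try:
                # Re-queue the task; once max_retries is exceeded this raises
                # MaxRetriesExceededError instead of scheduling another retry.
                raise request.retry(exc=exc)
            except MaxRetriesExceededError:
                # Retries exhausted: persist the failure so the request can be
                # inspected or replayed later instead of disappearing.
                log_failed_request(xml, exc)  # hypothetical helper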

Examples of Django and Celery: Periodic Tasks

我的梦境 submitted on 2019-12-03 05:08:36
Question: I have been fighting the Django/Celery documentation for a while now and need some help. I would like to be able to run periodic tasks using django-celery. I have seen several different formats and schemas around the internet (and in the documentation) for how one should go about achieving this with Celery... Can someone help with a basic, functioning example of the creation, registration, and execution of a django-celery periodic task? In particular, I want to know whether I should write a task
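A minimal, hedged sketch of the settings-based approach from the django-celery era; the task name send_reminders, the module path, and the 5-minute interval are assumptions:

    # tasks.py
    from celery import task

    @task()
    def send_reminders():
        # The periodic work goes here.
        ...

    # settings.py
    from datetime import timedelta

    CELERYBEAT_SCHEDULE = {
        'send-reminders-every-5-minutes': {
            'task': 'tasks.send_reminders',
            'schedule': timedelta(minutes=5),
        },
    }

The schedule is only read by the beat process, so a beat instance has to run alongside the worker (for example via python manage.py celery beat with django-celery).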

Starting Celery: AttributeError: 'module' object has no attribute 'celery'

女生的网名这么多〃 submitted on 2019-12-03 05:08:06
I am trying to start a Celery worker server from the command line:

    celery -A tasks worker --loglevel=info

The code in tasks.py:

    import os
    os.environ['DJANGO_SETTINGS_MODULE'] = "proj.settings"
    from celery import task

    @task()
    def add_photos_task(lad_id):
        ...

I get the following error:

    Traceback (most recent call last):
      File "/usr/local/bin/celery", line 8, in <module>
        load_entry_point('celery==3.0.12', 'console_scripts', 'celery')()
      File "/usr/local/lib/python2.7/site-packages/celery-3.0.12-py2.7.egg/celery/__main__.py", line 14, in main
        main()
      File "/usr/local/lib/python2.7/site-packages/celery-3.0.12
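A hedged sketch of the usual cause: celery -A tasks expects the tasks module to expose a Celery application instance, and in the 3.0.x series that attribute was conventionally named celery. The module above only imports the task decorator, so defining an app is one likely fix (the broker URL below is an assumption):

    # tasks.py (sketch)
    import os
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

    from celery import Celery

    # The attribute the command line looks up when you pass "-A tasks".
    celery = Celery('tasks', broker='amqp://guest@localhost//')

    @celery.task()
    def add_photos_task(lad_id):
        ...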

Django Celery: Admin interface showing zero tasks/workers

大憨熊 submitted on 2019-12-03 04:48:28
Question: I've set up Celery with the Django ORM as back-end and am trying to monitor what's going on behind the scenes. I started celeryd with the -E flag:

    python manage.py celeryd -E -l INFO -v 1 -f /path/to/celeryd.log

and started celerycam with the default snapshot frequency of 1 second:

    python manage.py celerycam

I can see the tasks being executed (in the celery log) and results being stored (data models are periodically being changed by those tasks). However, the Task/Worker pages in the Django admin panel show zero items.
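For reference, a sketch of the django-celery settings this kind of setup usually relies on; whether these match the asker's configuration is an assumption:

    # settings.py
    CELERY_RESULT_BACKEND = 'database'   # store task results via the Django ORM
    CELERY_SEND_EVENTS = True            # same effect as starting celeryd with -E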

Django Celery ConnectionError: Too many heartbeats missed

十年热恋 submitted on 2019-12-03 03:09:57
Question
How can I solve the ConnectionError: Too many heartbeats missed from Celery?

Example Error

    [2013-02-11 15:15:38,513: ERROR/MainProcess] Error in timer: ConnectionError('Too many heartbeats missed', None, None, None, '')
    Traceback (most recent call last):
      File "/app/.heroku/python/lib/python2.7/site-packages/celery/utils/timer2.py", line 97, in apply_entry
        entry()
      File "/app/.heroku/python/lib/python2.7/site-packages/celery/utils/timer2.py", line 51, in __call__
        return self.fun(*self.args, **self.kwargs)
      File "/app/.heroku/python/lib/python2.7/site-packages/celery/utils/timer2.py",
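One commonly suggested workaround for this error (an assumption here, not a confirmed fix from the original thread) is to disable AMQP heartbeats in the Celery settings:

    # settings.py
    # 0 disables heartbeats, so a slow or paused connection can no longer
    # trigger "Too many heartbeats missed".
    BROKER_HEARTBEAT = 0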