django-celery

Can celery celerybeat dynamically add/remove tasks at runtime?

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-10 23:20:16
Question: I have a project that does not include Django, so I can't use djcelery. But I found a modification of django-celery's DatabaseScheduler that uses SQLAlchemy instead. It works fine, just like djcelery's DatabaseScheduler does. The only problem is that it doesn't send tasks that were added at runtime; after I restart celery-beat, the tasks that were added earlier are sent successfully. So, is it possible to dynamically add/remove tasks without restarting celery-beat? Thanks for any advice.
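No answer is recorded in this excerpt, but the usual pattern (a sketch assuming a SQLAlchemy-backed store like the one described; the two helper functions are hypothetical) is that beat only picks up entries through the scheduler's `schedule` property, so a custom scheduler has to detect database changes itself on every tick and keep `max_interval` short:

```python
# scheduler.py -- a sketch; load_entries_from_db() and
# get_schedule_version() are hypothetical helpers over the
# SQLAlchemy tables described in the question.
from celery.beat import Scheduler

class DynamicDatabaseScheduler(Scheduler):
    # Longest beat may sleep between checks, in seconds; keep it
    # short so newly added tasks are picked up quickly.
    max_interval = 5

    def setup_schedule(self):
        self._last_version = None
        self._entries = {}

    @property
    def schedule(self):
        version = get_schedule_version()  # hypothetical: bumped on every add/remove
        if version != self._last_version:
            # Reload {name: ScheduleEntry} mapping from the database.
            self._entries = load_entries_from_db()
            self._last_version = version
        return self._entries
```

djcelery's DatabaseScheduler works along similar lines, comparing a last-changed marker on each tick instead of reloading everything unconditionally.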

Django Celery memory not released

Submitted by 佐手、 on 2019-12-10 20:39:11
Question: In my django project I have the following dependencies:

django==1.5.4
django-celery==3.1.9
amqp==1.4.3
kombu==3.0.14
librabbitmq==1.0.3 (as suggested by https://stackoverflow.com/a/17541942/1452356)

In dev_settings.py:

```python
DEBUG = False
BROKER_URL = "django://"

import djcelery
djcelery.setup_loader()

CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERYD_CONCURRENCY = 2
# CELERYD_TASK_TIME_LIMIT = 10
```

CELERYD_TASK_TIME_LIMIT is commented out as suggested here https://stackoverflow.com
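No answer is recorded in this excerpt, but a common mitigation for worker memory growth (a sketch, not from the original thread — it assumes the leak lives in the long-running worker processes rather than the broker) is to recycle each worker process after a fixed number of tasks:

```python
# dev_settings.py -- recycle each worker process after it has run
# 100 tasks; the threshold is an arbitrary example value.
CELERYD_MAX_TASKS_PER_CHILD = 100
```

A recycled process returns all of its memory to the OS, which masks slow leaks at the cost of periodic fork overhead.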

How do I send channels 2.x group message from django-celery 3 task?

Submitted by 守給你的承諾、 on 2019-12-10 18:56:09
Question: I need to postpone sending a channels message. Here is my code:

```python
# consumers.py
class ChatConsumer(WebsocketConsumer):

    def chat_message(self, event):
        self.send(text_data=json.dumps(event['message']))

    def connect(self):
        self.channel_layer.group_add(self.room_name, self.channel_name)
        self.accept()

    def receive(self, text_data=None, bytes_data=None):
        send_message_task.apply_async(
            args=(
                self.room_name,
                {'type': 'chat_message', 'message': 'the message'}
            ),
            countdown=10
        )

# tasks.py
@shared_task
def
```
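The question is cut off before the task body, but the usual shape of such a task (a sketch assuming channels 2.x's `get_channel_layer` and `async_to_sync`, with the signature implied by the `apply_async` call above) is:

```python
# tasks.py -- a sketch; matches the (room_name, message) args used above
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer

@shared_task
def send_message_task(room_name, message):
    # Channel-layer methods are coroutines, so wrap them for use
    # inside a synchronous celery task. The 'type' key in the
    # message dispatches to the consumer's chat_message handler.
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(room_name, message)
```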

What are the advantages of celerybeat over cron?

Submitted by 懵懂的女人 on 2019-12-10 17:18:44
Question: I see many people preferring celerybeat over cron jobs for periodic tasks. I see the documentation for celerybeat and I can find information on how to use it, but not why (or when) I should prefer it over cron jobs. http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#introduction

Answer 1: I have used both and have come to the conclusion that beat gives you better control than cron. You can wire it up so that your control is via the django admin instead of sshing in and changing the
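To make the comparison concrete, a minimal beat schedule might look like this (a sketch; the task path and interval are made up for illustration):

```python
# settings.py -- a minimal sketch; task path and interval are examples
from datetime import timedelta

# With the database scheduler, entries become rows you can edit in
# the django admin rather than lines in a crontab.
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"

CELERYBEAT_SCHEDULE = {
    'cleanup-stale-sessions': {
        'task': 'myapp.tasks.cleanup_stale_sessions',
        'schedule': timedelta(minutes=15),
    },
}
```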

Celery time statistics per-task-name

Submitted by 做~自己de王妃 on 2019-12-10 16:49:08
Question: I have some fairly busy celery queues, but I'm not sure which tasks are the problematic ones. Is there a way to aggregate results to figure out which tasks are taking a long time? I have 10-20 workers on 2-4 servers, using redis as both the broker and the result backend. I noticed the busy queues in Flower, but I can't figure out how to get time statistics aggregated per task.

Answer 1: Method 1: If you have enabled logging when the celery workers are started, they log the time taken for each task. $
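Building on that first method: workers log task completions as `Task <name>[<id>] succeeded in <seconds>s`, so per-task statistics can be aggregated from the log file with a short script (a sketch — the log path is an assumption, and the regex assumes the default worker log format):

```python
# aggregate_task_times.py -- a rough sketch over the default
# "Task <name>[<id>] succeeded in <seconds>s" worker log lines.
import re
from collections import defaultdict

PATTERN = re.compile(r"Task (\S+)\[[^\]]+\] succeeded in ([\d.]+)s")

durations = defaultdict(list)
with open("celery_worker.log") as f:  # hypothetical log path
    for line in f:
        match = PATTERN.search(line)
        if match:
            durations[match.group(1)].append(float(match.group(2)))

# Print per-task counts and timings, slowest average first.
for name, times in sorted(durations.items(),
                          key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"{name}: n={len(times)} avg={sum(times) / len(times):.3f}s "
          f"max={max(times):.3f}s")
```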

How to start celery in the background from the terminal in Django

Submitted by 大兔子大兔子 on 2019-12-10 15:31:05
Question: I am starting celery as

python manage.py celeryd

It is working, but in the foreground, so to test commands I need to start another terminal and do stuff there. Is there any way to start it in the background? I tried this:

python manage.py celeryd &

But then again it comes to the foreground.

Answer 1: You're looking for celeryd_detach, available since at least 2.4:

python manage.py celeryd_detach

Answer 2: You can use this to get celeryd to work in the background:

$ nohup celeryd start &

The above command pushes

Celery Storing unrecoverable task failures for later resubmission

Submitted by 青春壹個敷衍的年華 on 2019-12-10 06:46:37
Question: I'm using the djkombu transport for my local development, but I will probably be using amqp (rabbit) in production. I'd like to be able to iterate over failures of a particular type and resubmit them. This would cover the case of something failing on a server, or an edge-case bug triggered by some new variation in the data. So I could be resubmitting jobs up to 12 hours later, after some bug is fixed or a third-party site is back up. My question is: is there a way to access old failed jobs via the
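One possible approach (not from the original thread — a sketch assuming you can add your own bookkeeping) is to record failures with celery's `task_failure` signal and replay them later with `send_task`; the `FailedTask` model here is hypothetical:

```python
# failures.py -- a sketch; FailedTask is a hypothetical model/table
import json

from celery import current_app
from celery.signals import task_failure

from myapp.models import FailedTask  # hypothetical model

@task_failure.connect
def record_failure(sender=None, task_id=None, args=None, kwargs=None,
                   exception=None, **extra):
    # Persist enough information to resubmit the task later.
    FailedTask.objects.create(
        task_id=task_id,
        name=sender.name,
        args=json.dumps(args or []),
        kwargs=json.dumps(kwargs or {}),
        error=repr(exception),
    )

def resubmit_failures(name):
    # Replay stored failures of one task type after the bug is fixed.
    for failed in FailedTask.objects.filter(name=name):
        current_app.send_task(failed.name,
                              args=json.loads(failed.args),
                              kwargs=json.loads(failed.kwargs))
        failed.delete()
```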

RabbitMQ/Celery/Django Memory Leak?

Submitted by 旧时模样 on 2019-12-10 01:23:48
Question: I recently took over another part of the project that my company is working on and have discovered what seems to be a memory leak in our RabbitMQ/Celery setup. Our system has 2 GB of memory, with roughly 1.8 GB free at any given time. We have multiple tasks that crunch large amounts of data and add them to our database. When these tasks run, they consume a rather large amount of memory, quickly dropping our available memory to anywhere between 16 MB and 300 MB. The problem is, after these tasks
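To narrow down which tasks hold onto memory, one option (a diagnostic sketch, not from the original question) is to log each worker process's resident set size around every task using celery's `task_prerun`/`task_postrun` signals and the stdlib `resource` module:

```python
# memwatch.py -- a diagnostic sketch using stdlib resource accounting
import logging
import resource

from celery.signals import task_prerun, task_postrun

logger = logging.getLogger(__name__)
_rss_before = {}

def _rss_kb():
    # ru_maxrss is reported in kilobytes on Linux.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

@task_prerun.connect
def before(task_id=None, task=None, **kwargs):
    _rss_before[task_id] = _rss_kb()

@task_postrun.connect
def after(task_id=None, task=None, **kwargs):
    # ru_maxrss is a high-water mark, so a positive delta means this
    # task pushed the worker's peak memory higher.
    grew = _rss_kb() - _rss_before.pop(task_id, 0)
    if grew > 0:
        logger.warning("%s grew worker RSS by %d kB", task.name, grew)
```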

Starting Celery: AttributeError: 'module' object has no attribute 'celery'

Submitted by 血红的双手。 on 2019-12-09 05:09:02
Question: I try to start a Celery worker server from the command line:

celery -A tasks worker --loglevel=info

The code in tasks.py:

```python
import os
os.environ['DJANGO_SETTINGS_MODULE'] = "proj.settings"

from celery import task

@task()
def add_photos_task(lad_id):
    ...
```

I get the following error:

Traceback (most recent call last):
  File "/usr/local/bin/celery", line 8, in <module>
    load_entry_point('celery==3.0.12', 'console_scripts', 'celery')()
  File "/usr/local/lib/python2.7/site-packages/celery-3.0.12-py2.7.egg
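The traceback is cut off, but the error text points at the usual cause: with celery 3.0, `-A tasks` resolves to an attribute named `celery` on the `tasks` module. The standard fix is to define an app instance there (a sketch — the broker URL is an example):

```python
# tasks.py -- a minimal sketch; the broker URL is an example
import os
os.environ['DJANGO_SETTINGS_MODULE'] = "proj.settings"

from celery import Celery

# `celery -A tasks worker` looks up this `celery` attribute.
celery = Celery('tasks', broker='amqp://guest@localhost//')

@celery.task()
def add_photos_task(lad_id):
    ...
```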

Django celery - asyncio - daemonic processes are not allowed to have children

Submitted by 泄露秘密 on 2019-12-08 16:29:23
Question: I can see that similar questions have been asked before, but those involve multiprocessing directly rather than executors, so I am unsure how to fix this. The GitHub issue also says it was resolved in 4.1: https://github.com/celery/celery/issues/1709 I am using:

celery==4.1.1
django-celery==3.2.1
django-celery-beat==1.0.1
django-celery-results==1.0.1

My script is as follows; I've tried to cut it down to show only the relevant code.

```python
@asyncio.coroutine
def snmp_get(ip, oid, snmp_user, snmp_auth, snmp_priv):
```
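The error usually means something inside the task tried to spawn child processes (for example a `ProcessPoolExecutor`), which celery's daemonized prefork workers forbid. One workaround (a sketch under that assumption, not a confirmed answer from this thread) is to drive an event loop directly inside the task so no child processes are needed:

```python
# tasks.py -- a sketch; snmp_get is the coroutine from the question
import asyncio

from celery import shared_task

from .snmp import snmp_get  # hypothetical module holding the coroutine

@shared_task
def poll_devices(targets):
    # Run the coroutines on a fresh event loop inside this worker
    # process; no executor means no forbidden child processes.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        jobs = [snmp_get(**target) for target in targets]
        return loop.run_until_complete(asyncio.gather(*jobs))
    finally:
        loop.close()
```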