celerybeat

zoneinfo data corrupt, how do I compile new data?

Submitted by ぃ、小莉子 on 2020-01-06 08:14:15
Question: Basically the same thing happened again as when I asked this question. This time, however, I cannot get it right again. I tried Burhan Khalid's answer again and got the same errors. I also tried copy-pasting the zoneinfo folder from a backup again, but this time it did not fix my errors. Version of Django = 1.4.5, Celery = 3.0.8, Django-Celery = 3.0.6, pytz = 2013b (same as the files I am downloading), OS = Mac OS X Mountain Lion. Attempt 1: Clear the
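Before re-downloading or recompiling data, it can help to confirm whether pytz can actually read its bundled zoneinfo files. A minimal check sketch, assuming pytz is importable (the timezone name is just an example):

```python
import pytz

# Try to load a timezone; a corrupt zoneinfo tree typically raises
# UnknownTimeZoneError or an IOError here rather than returning a tzinfo.
try:
    tz = pytz.timezone("Europe/Amsterdam")
    print("pytz", pytz.__version__, "loaded", tz.zone)
except (pytz.UnknownTimeZoneError, IOError) as exc:
    print("zoneinfo appears broken:", exc)
```

If this fails, reinstalling pytz itself (rather than patching individual data files) usually restores a consistent zoneinfo tree.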

Celery Async Tasks and Periodic Tasks together

Submitted by 你离开我真会死。 on 2020-01-05 07:01:08
Question: I am unable to run periodic tasks and asynchronous tasks together. If I comment out the periodic task, the asynchronous tasks execute fine; otherwise the asynchronous tasks get stuck. Running: celery==4.0.2, Django==2.0, django-celery-beat==1.1.0, django-celery-results==1.0.1. I referred to https://github.com/celery/celery/issues/4184 to choose celery==4.0.2, as it seems to work. It seems to be a known issue: https://github.com/celery/django-celery-beat/issues/27 I've also done some
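One pattern that avoids the stuck-worker symptom is to keep beat and the worker in separate processes (rather than embedding beat with `worker -B`) and to define the schedule explicitly. A minimal sketch, with hypothetical project and task names (`proj`, `proj.tasks.refresh`):

```python
# settings.py fragment: schedule one periodic task. Async tasks are
# unaffected because beat only *sends* messages; workers consume them.
# Run beat and the worker as two separate processes:
#   celery -A proj beat -l info
#   celery -A proj worker -l info
from datetime import timedelta

CELERY_BEAT_SCHEDULE = {
    "refresh-every-30-minutes": {
        "task": "proj.tasks.refresh",      # hypothetical task path
        "schedule": timedelta(minutes=30),
    },
}
```

With django-celery-beat the schedule can instead live in the database; the separate-process layout stays the same.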

How to make two tasks mutually exclusive in Celery?

Submitted by ♀尐吖头ヾ on 2020-01-03 05:07:12
Question: Is there a way to disallow two different tasks from running simultaneously in Celery? I was thinking about defining a new queue with concurrency level 1 and sending those tasks to that queue, but I couldn't find an example. Is that possible? Thanks! Answer 1: Yes, if you don't need to worry about overall throughput, it is possible to create a separate queue and have a dedicated worker with concurrency set to 1. You can create as many queues as you want and configure which of those queues each worker
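The answer's idea can be sketched as a routing configuration plus a dedicated single-process worker; the task paths and queue name below are hypothetical:

```python
# settings.py fragment: send both mutually exclusive tasks to one
# dedicated queue, then start exactly one worker process for it so
# the two tasks can never overlap:
#   celery -A proj worker -Q exclusive -c 1 -l info
CELERY_TASK_ROUTES = {
    "proj.tasks.task_a": {"queue": "exclusive"},
    "proj.tasks.task_b": {"queue": "exclusive"},
}
```

Other queues should be served by separate workers so overall throughput is not reduced by the single-process worker.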

How can I get rid of legacy tasks still in the Celery / RabbitMQ queue?

Submitted by 醉酒当歌 on 2019-12-24 08:26:09
Question: I am running Django + Celery + RabbitMQ. After modifying some task names I started getting "unregistered task" KeyErrors, even after removing tasks with this key from the periodic-tasks table in Django Celery Beat and restarting the Celery worker. They persist even after running with the --purge option. How can I get rid of them? Answer 1: To flush out the last of these tasks, you can re-implement them with their old method headers but no logic. For example, if you removed the method original and

How to run celery schedule instantly?

Submitted by 我怕爱的太早我们不能终老 on 2019-12-24 03:23:42
Question: I have a Celery schedule configured like this: CELERYBEAT_SCHEDULE = { "runs-every-30-seconds": { "task": "tasks.refresh", "schedule": timedelta(hours=1) }, } After testing I find that this schedule first runs after 1 hour, but I want it to run instantly and then again after 1 hour. Answer 1: If you mean at startup, do it in AppConfig.ready() (new in Django 1.7): # my_app/__init__.py: class MyAppConfig(AppConfig): def ready(self): tasks.refresh.delay() Also see: https://docs

How to programmatically generate celerybeat entries with celery and Django

Submitted by 喜夏-厌秋 on 2019-12-20 09:37:49
Question: I am hoping to programmatically generate celerybeat entries and resync celerybeat when entries are added. The docs state: "By default the entries are taken from the CELERYBEAT_SCHEDULE setting, but custom stores can also be used, like storing the entries in an SQL database." So I am trying to figure out which classes I need to extend to be able to do this. I have been looking at the Celery scheduler docs and the djcelery API docs, but the documentation on what some of these methods do is
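Rather than extending the scheduler classes directly, the modern route is django-celery-beat (the successor to djcelery's schedule store): entries are ordinary model rows, and beat's DatabaseScheduler picks up changes without a restart. A sketch, assuming a Django project with django_celery_beat installed and migrated (task path hypothetical):

```python
# Requires beat to run with the database scheduler:
#   celery -A proj beat --scheduler django_celery_beat.schedulers:DatabaseScheduler
from django_celery_beat.models import IntervalSchedule, PeriodicTask

schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.MINUTES,
)
PeriodicTask.objects.update_or_create(
    name="refresh-data",                 # unique human-readable name
    defaults={
        "interval": schedule,
        "task": "proj.tasks.refresh",    # hypothetical registered task
    },
)
```

Creating, disabling, or deleting PeriodicTask rows is the "resync" mechanism: the DatabaseScheduler notices schedule changes on its own.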

celery daemon production local config file without django

Submitted by 狂风中的少年 on 2019-12-18 07:04:07
Question: I am a newbie to Celery. I created a project following the Celery 4.1 docs. Below are my project folders and files: mycelery | test_celery | celery_app.py tasks.py __init__.py 1-celery_app.py from __future__ import absolute_import import os from celery import Celery from kombu import Queue, Exchange from celery.schedules import crontab import datetime app = Celery('test_celery', broker='amqp://jimmy:jimmy123@localhost/jimmy_v_host', backend='rpc://', include=['test_celery.tasks'

Django 1.6 + RabbitMQ 3.2.3 + Celery 3.1.9 - why does my celery worker die with: WorkerLostError: Worker exited prematurely: signal 11 (SIGSEGV)

Submitted by 核能气质少年 on 2019-12-12 11:20:53
Question: This seems to address a very similar issue, but doesn't give me quite enough insight: https://github.com/celery/billiard/issues/101 It sounds like it might be a good idea to try a non-SQLite database... I have a straightforward Celery setup with my Django app. In my settings.py file I set a task to run as follows: CELERYBEAT_SCHEDULE = { 'sync_database': { 'task': 'apps.data.tasks.celery_sync_database', 'schedule': timedelta(minutes=5) } } I have followed the instructions here: http://celery

Django celerybeat periodic task only runs once

Submitted by 泪湿孤枕 on 2019-12-12 04:24:26
Question: I am trying to schedule a task that runs every 10 minutes using Django 1.9.8, Celery 4.0.2, RabbitMQ 2.1.4, and Redis 2.10.5, all running in Docker containers on Linux (Fedora 25). I have tried many combinations of things that I found in the Celery docs and on this site. The only combination that has worked so far is below. However, it only runs the periodic task initially, when the application starts; the schedule is ignored thereafter. I have absolutely confirmed that the

Sharing an Oracle database connection between simultaneous Celery tasks

Submitted by 痴心易碎 on 2019-12-12 02:49:21
Question: I'm working with Python 2.7, Celery, and cx_Oracle to access an Oracle database. I create a lot of tasks, each of which runs a query through cx_Oracle, and many of these tasks run simultaneously. All tasks should share the same database connection. If I launch only one task, the query runs correctly. However, if I launch several queries, I start getting this error message: [2016-04-04 17:12:43,846: ERROR/MainProcess] Task tasks.run_query[574a6e7f-f58e-4b74-bc84-af4555af97d6] raised unexpected: