celery

Unable to start Airflow worker/flower and need clarification on Airflow architecture to confirm that the installation is correct

倖福魔咒の submitted on 2019-11-30 06:57:24
Question: Running a worker on a different machine results in the errors specified below. I have followed the configuration instructions and have synced the dags folder. I would also like to confirm that RabbitMQ and PostgreSQL only need to be installed on the Airflow core machine and do not need to be installed on the workers (the workers only connect to the core). The specification of the setup is detailed below: Airflow core/server computer has the following installed: Python 2.7 with airflow (AIRFLOW…
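To make the worker-only footprint concrete, here is a minimal sketch of the settings a worker machine would need, assuming Airflow's AIRFLOW__SECTION__KEY environment-variable convention; the hostnames and credentials are hypothetical. The worker only points at the broker and metadata database on the core machine, which is why RabbitMQ and PostgreSQL need no local installation:

```python
import os

# Hypothetical worker-side configuration: every URL points at the core host.
os.environ['AIRFLOW__CORE__EXECUTOR'] = 'CeleryExecutor'
os.environ['AIRFLOW__CELERY__BROKER_URL'] = (
    'amqp://guest:guest@airflow-core.example.com:5672//')   # RabbitMQ on core
os.environ['AIRFLOW__CELERY__RESULT_BACKEND'] = (
    'db+postgresql://airflow:airflow@airflow-core.example.com:5432/airflow')
```

The worker machine then only runs the worker process itself; it serves nothing of its own.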

How do I schedule a task with Celery that runs on the 1st of every month?

跟風遠走 submitted on 2019-11-30 06:56:50
How do I schedule a task with Celery that runs on the 1st of every month? Since Celery 3.0 the crontab schedule supports day_of_month and month_of_year arguments: http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#crontab-schedules

You can do this using crontab schedules, and you can define it either in your Django settings.py:

```python
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'my_periodic_task': {
        'task': 'my_app.tasks.my_periodic_task',
        # Execute at midnight on the first day of every month.
        'schedule': crontab(0, 0, day_of_month='1'),
    },
}
```

or in your celery.py config: from …
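The excerpt cuts off at the celery.py variant; here is a sketch of what it presumably looks like, assuming a Celery 4.x application object (the app and broker names are illustrative):

```python
from celery import Celery
from celery.schedules import crontab

app = Celery('my_app', broker='amqp://guest:guest@localhost:5672//')

# Celery 4.x spelling of the same schedule, attached to the app directly.
app.conf.beat_schedule = {
    'my_periodic_task': {
        'task': 'my_app.tasks.my_periodic_task',
        'schedule': crontab(minute=0, hour=0, day_of_month='1'),
    },
}
```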

Disable Django Debugging for Celery

笑着哭i submitted on 2019-11-30 06:50:09
Is it possible to set DEBUG=False for only a specific app in Django? Celery has a known memory leak when debugging is enabled. I have a development server where I want Celery to run as a service without debugging, so it doesn't leak memory, but I want the rest of my Django app to use debugging so errors will be shown when testing.

Celery doesn't have a memory leak; it's how Django works: when DEBUG is enabled, Django appends every executed SQL statement to django.db.connection.queries, and this grows unbounded in a long-running process environment. I guess you could use a hack like: if …
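One plausible completion of that hack (an assumption, since the excerpt is cut off): have settings.py detect that it is being loaded by the celery worker process and turn DEBUG off only there.

```python
# settings.py (sketch): disable DEBUG only when running under celeryd,
# which in this era was launched as ./manage.py celeryd.
import sys

DEBUG = True
if 'celeryd' in sys.argv:
    DEBUG = False
```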

How to configure and run a Celery worker on a remote system

余生颓废 submitted on 2019-11-30 06:31:14
Question: I am working with Celery and a RabbitMQ server. I created a Django project on a server (where the message queue and database live) and it is working fine; I have also created multiple workers.

```python
from kombu import Exchange, Queue

CELERY_CONCURRENCY = 8
CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
CELERY_RESULT_BACKEND = 'amqp'
CELERYD_HIJACK_ROOT_LOGGER = True
CELERY_HIJACK_ROOT_LOGGER = True
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

CELERY_QUEUES = (
    Queue(
```
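For the remote machine itself, a minimal sketch (the hostname is hypothetical): the worker reuses the same settings, except BROKER_URL must point at the central RabbitMQ rather than localhost, and the worker process is started locally on that machine.

```python
# settings.py on the remote worker: identical config, but the broker URL
# names the machine hosting RabbitMQ (hostname is hypothetical).
BROKER_URL = 'amqp://guest:guest@queue-host.example.com:5672//'

# The worker is then started on the remote machine, e.g.:
#   celery -A proj worker -l info
```

Note that recent RabbitMQ versions restrict the default guest account to localhost, so a remote worker usually needs a dedicated RabbitMQ user.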

Why does celery add thousands of queues to rabbitmq that seem to persist long after the tasks complete?

戏子无情 submitted on 2019-11-30 06:30:36
I am using celery with a rabbitmq backend. It is producing thousands of queues in rabbitmq, each holding 0 or 1 items, like this:

```
$ sudo rabbitmqctl list_queues
Listing queues ...
c2e9b4beefc7468ea7c9005009a57e1d    1
1162a89dd72840b19fbe9151c63a4eaa    0
07638a97896744a190f8131c3ba063de    0
b34f8d6d7402408c92c77ff93cdd7cf8    1
f388839917ff4afa9338ef81c28aad75    0
8b898d0c7c7e4be4aa8007b38ccc00ea    1
3fb4be51aaaa4ac097af535301084b01    1
```

This seems inefficient, and furthermore I have observed that these queues persist long after processing is finished. I have found the task that appears to be doing this: …
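The usual culprit (an assumption, since the excerpt cuts off before showing the task) is the amqp result backend: it creates one uniquely-named queue per task result and keeps it until the result is consumed or expires. Two old-style settings that address this:

```python
# settings.py (sketch): either stop storing results entirely, or let the
# per-result queues expire so RabbitMQ can clean them up.
CELERY_IGNORE_RESULT = True           # tasks create no result queues at all
# ...or, if the results are actually fetched somewhere:
CELERY_TASK_RESULT_EXPIRES = 3600     # expire result queues after one hour
```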

Detect whether Celery is Available/Running

天大地大妈咪最大 submitted on 2019-11-30 06:17:37
Question: I'm using Celery to manage asynchronous tasks. Occasionally, however, the celery process goes down, and then none of the tasks get executed. I would like to be able to check the status of Celery and make sure everything is working fine, and if I detect any problems, display an error message to the user. From the Celery Worker documentation it looks like I might be able to use ping or inspect for this, but ping feels hacky and it's not clear exactly how inspect is meant to be used (if …
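A minimal liveness check built on the inspect/ping machinery the question mentions (a sketch, assuming a configured Celery application object named app): inspect().ping() broadcasts to all workers and returns None when none reply within the timeout.

```python
def celery_is_available(app, timeout=1.0):
    """Return True if at least one Celery worker answers a ping."""
    try:
        replies = app.control.inspect(timeout=timeout).ping()
    except Exception:          # e.g. the broker itself is unreachable
        return False
    return bool(replies)       # None / {} means no worker responded
```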

celery beat: "Pidfile (celerybeat.pid) already exists" error

南笙酒味 submitted on 2019-11-30 06:00:53
Today, while using celery in Django to add tasks automatically, running celery beat -A celery_task -l info to start the task-adding service failed with: Pidfile (celerybeat.pid) already exists.

Cause: when celery beat runs, it automatically creates two files: a pidfile (default celerybeat.pid, saved in the project root) and a scheduler file (default celerybeat-schedule, saved in the project root). The error means the pidfile already exists: it was created on the previous run and was not deleted when the process ended, so the next start fails.

Solution: delete the pidfile and start celery beat again with celery beat -A celery_task -l info; it then runs successfully. But does the file really have to be deleted before every restart? According to suggestions found online, the problem can be avoided by passing the pidfile option with an empty value when starting celery beat (celery beat -A celery_task -l info --pidfile=).

Source: https://www.cnblogs.com/863652104kai/p/11565764.html
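If deleting the file by hand gets tedious, a small helper along these lines (a sketch, not part of the original post) removes the pidfile only when it is genuinely stale, i.e. when the recorded process no longer exists:

```python
import os

def remove_stale_pidfile(path='celerybeat.pid'):
    """Delete the beat pidfile if the process it names is no longer alive."""
    try:
        with open(path) as f:
            pid = int(f.read().strip())
    except (IOError, ValueError):
        return                       # no pidfile, or unreadable contents
    try:
        os.kill(pid, 0)              # signal 0: existence check, kills nothing
    except OSError:                  # no such process -> the pidfile is stale
        os.remove(path)
```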

Creating a Celery worker using node.js

本小妞迷上赌 submitted on 2019-11-30 05:43:14
Using node-celery, we can enable Node to push Celery jobs to the task queue. How can we allow Node to be a Celery worker and consume the queue?

For Celery, if the endpoint is amqp, check out the Celery.js GitHub: any Node process started as an amqp consumer would work fine. For every other self.conf.backend_type you can have a different consumer. The following example is for amqp only; the message received below may be the Celery task object:

```javascript
var amqp = require('amqp');
var connection = amqp.createConnection({ host: "localhost", port: 5672 });

connection.on('ready', function () {
    connection
```
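For reference (a sketch that is not part of the original answer): under Celery's original task message protocol, the JSON body a producer publishes looks roughly like the dict below, and this is what an amqp consumer such as the one above has to decode; the id and task name are illustrative.

```python
import json

# Illustrative Celery task message (protocol v1). A Node consumer would
# receive this as the message body and dispatch on the "task" field.
task_message = {
    "id": "4cc7438e-afd4-4f8f-a2f3-f46567e7ca77",  # unique task id (example)
    "task": "my_app.tasks.my_periodic_task",       # hypothetical task name
    "args": [2, 2],
    "kwargs": {},
    "retries": 0,
    "eta": None,
}
print(json.dumps(task_message))
```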

How can I disable the Django Celery admin modules?

喜夏-厌秋 submitted on 2019-11-30 05:18:01
I have no need for the Celery modules in my Django admin. Is there a way I could remove them?

okm: To be more specific, in the admin.py of any app that appears in INSTALLED_APPS after 'djcelery':

```python
from django.contrib import admin
from djcelery.models import (
    TaskState, WorkerState, PeriodicTask,
    IntervalSchedule, CrontabSchedule)

admin.site.unregister(TaskState)
admin.site.unregister(WorkerState)
admin.site.unregister(IntervalSchedule)
admin.site.unregister(CrontabSchedule)
admin.site.unregister(PeriodicTask)
```

You can simply unregister Celery's models like admin.site.unregister(CeleryModelIdoNotWantInAdmin).
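A defensive variant (a sketch, assuming the Django versions contemporary with djcelery): unregister() raises NotRegistered if a model was never registered, e.g. when admin autodiscovery order varies, so the calls can be wrapped:

```python
from django.contrib import admin
from django.contrib.admin.sites import NotRegistered
from djcelery.models import TaskState, WorkerState

for model in (TaskState, WorkerState):
    try:
        admin.site.unregister(model)
    except NotRegistered:
        pass   # model wasn't registered in this process; nothing to remove
```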

RabbitMQ / Celery with Django hangs on delay/ready/etc - No useful log info

天大地大妈咪最大 submitted on 2019-11-30 05:17:30
Question: So I just set up celery and rabbitmq, created my user, set up the vhost, mapped the user to the vhost, and ran the celery daemon successfully (or so I assume):

```
(queuetest)corky@corky-server:~/projects/queuetest$ ./manage.py celeryd
celery@corky-server v0.9.5 is starting.
Configuration ->
    . broker -> amqp://celery@localhost:5672/
    . queues ->
        . celery -> exchange:celery (direct) binding:celery
    . concurrency -> 2
    . loader -> celery.loaders.djangoapp
    . logfile -> [stderr]@WARNING
    . events -> OFF
    .
```
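When delay()/ready() hangs like this, the block often happens while waiting on the result backend rather than while publishing; a timeout turns the silent hang into a visible error (a sketch, with a hypothetical task module):

```python
from myapp.tasks import add   # hypothetical task

result = add.delay(2, 2)
try:
    print(result.get(timeout=5))   # raises TimeoutError instead of hanging
except Exception as exc:
    print('No result within 5s - check broker/result backend config:', exc)
```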