celery

init.d celery script for CentOS?

本秂侑毒 submitted on 2019-12-04 15:22:24

I'm writing a Django app that uses Celery. So far I've been running on Ubuntu, but I'm trying to deploy to CentOS. Celery comes with a nice init.d script for Debian-based distributions, but it doesn't work on RedHat-based distributions like CentOS because it uses start-stop-daemon. Does anybody have an equivalent one for RedHat that uses the same variable conventions, so I can reuse my /etc/default/celeryd file?

Camilo Nova: This is better solved here: Celery CentOS init script. You should be good using that one.

Since I didn't get an answer, I tried to roll my own:

```shell
#!/bin/sh
#
# chkconfig: 345 99 15
#
```
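For reference, a minimal sketch of what such a RedHat-style wrapper might look like, assuming the same /etc/default/celeryd variable conventions. The default paths and the `start_cmd` helper here are illustrative assumptions; a real script would daemonize via `daemon`/`killproc` from /etc/init.d/functions instead of start-stop-daemon:

```shell
#!/bin/sh
# Hypothetical sketch: reuse Debian-style /etc/default/celeryd settings on a
# RedHat-based system. This only builds the command line; a real init script
# would hand it to `daemon` from /etc/init.d/functions.

CELERYD_CONFIG="${CELERYD_CONFIG:-/etc/default/celeryd}"
[ -f "$CELERYD_CONFIG" ] && . "$CELERYD_CONFIG"

# Fall back to defaults if the config file did not set these (the paths are
# illustrative, not Celery's documented defaults).
CELERYD="${CELERYD:-/usr/local/bin/celeryd}"
CELERYD_PID_FILE="${CELERYD_PID_FILE:-/var/run/celeryd.pid}"
CELERYD_OPTS="${CELERYD_OPTS:-}"

start_cmd() {
    # Print the command a `start` action would run.
    echo "$CELERYD --pidfile=$CELERYD_PID_FILE $CELERYD_OPTS"
}
```

The point of sourcing the config file first is that an existing /etc/default/celeryd keeps working unchanged.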

Creating queues dynamically with Celery

人走茶凉 submitted on 2019-12-04 15:03:20

Question: I am writing a mailing-list manager using Django, Celery, and RabbitMQ. When a message comes in, a task is executed for each recipient. All tasks go to a single queue, and one or more workers consume tasks from the queue, constructing the email messages and sending them. The single queue causes a fairness problem: if a message comes in to a large mailing list, a large number of tasks are added to the queue, and other messages cannot get through to other, smaller mailing lists until all the
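A common fix for this kind of starvation is one queue per mailing list, with workers consuming the queues round-robin so a huge list cannot block small ones. Celery supports named queues, but the fairness idea itself can be sketched in plain Python; the queue names and the round-robin consumer below are illustrative, not Celery's actual scheduling logic:

```python
from collections import deque

def round_robin_dispatch(tasks_by_list):
    """Interleave tasks from per-list queues so no single mailing list
    monopolizes the workers. `tasks_by_list` maps list name -> iterable of
    tasks; returns the order a fair consumer would process them in."""
    queues = {name: deque(tasks) for name, tasks in tasks_by_list.items()}
    order = []
    while any(queues.values()):
        # One pass over all lists per round, taking at most one task each.
        for name in list(queues):
            if queues[name]:
                order.append((name, queues[name].popleft()))
    return order
```

With a 4-task "big" list and a 1-task "small" list, the small list's task is processed second rather than last.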

Access named volume from container when not running as root?

戏子无情 submitted on 2019-12-04 13:47:17

I'm running Celery under Docker Compose. I'd like to make Celery's Flower persistent, so I do:

```yaml
version: '2'
volumes:
  [...]
  flower_data: {}
[...]
flower:
  image: [base code image]
  ports:
    - "5555:5555"
  volumes:
    - flower_data:/flower
  command: celery -A proj flower --port=5555 --persistent=True --db=/flower/flower
```

However, I then get:

    IOError: [Errno 13] Permission denied: 'flower.dat'

I ran the following to elucidate why:

```shell
bash -c "ls -al /flower; whoami; celery -A proj flower --persistent=True --db=/flower/flower"
```

This made it clear why:

    flower_1 | drwxr-xr-x 3 root root 4096 Mar 10 23:05 . flower
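One common fix, sketched here as an assumption about the base image (the `celeryuser` name is hypothetical): pre-create the mount point with the right owner inside the image, since Docker initializes a named volume from the image's content and ownership the first time it is mounted. Alternatives are chowning the directory in an entrypoint script or running the service as root.

```dockerfile
# Hypothetical Dockerfile fragment for the flower image: create the mount
# point owned by the non-root user that runs celery, so the named volume
# inherits that ownership when first mounted.
RUN mkdir -p /flower && chown celeryuser:celeryuser /flower
```

Note this only helps for *named* volumes; host bind mounts keep the host's ownership regardless of what the image declares.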

How to run a task 5 minutes after the previous task finishes using celery-beat?

霸气de小男生 submitted on 2019-12-04 13:15:35

I have two tasks, a and b. Task a runs 5 minutes after the previous run of a finishes. Task b runs 3 minutes after the previous run of b finishes. How can I implement this? I'm using Python 3.6.8, Django 2.2.6, and Celery 4.3.0.

The short answer is that you can't do this with celery beat, because celery beat triggers off of task start, not task end. If you absolutely need to run three minutes after the previous task ends, you'd be advised to just add a call to .apply_async at the end of both a and b and kick off each task once.

Source: https://stackoverflow.com/questions/58629790

How to get all tasks and periodic tasks in Celery [duplicate]

坚强是说给别人听的谎言 submitted on 2019-12-04 12:55:38

Question: This question already has answers here. Closed 7 years ago. Possible duplicate: How can I find all subclasses of a given class in Python?

In my Django project, I have some subclasses of Celery's Task and PeriodicTask:

```python
class CustomTask(Task):
    # stuff

class CustomPeriodicTask(PeriodicTask):
    # stuff
```

I need all Task classes to add some custom logging configuration, so I thought I could use __subclasses__, but this does not work:

```python
>>> Task.__subclasses__()
[<unbound PeriodicTask>, <class handle
```
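The underlying issue is that __subclasses__() returns only *direct* subclasses, so anything two levels down (like a custom class under PeriodicTask, which itself sits under Task) is missed. A recursive walk, which is the standard recipe from the duplicate question rather than anything Celery-specific, finds them all:

```python
def all_subclasses(cls):
    """Recursively collect every subclass of `cls`, not just direct ones."""
    result = set(cls.__subclasses__())
    for sub in cls.__subclasses__():
        result |= all_subclasses(sub)
    return result
```

Applied to a small hierarchy mirroring Task -> PeriodicTask -> CustomPeriodicTask, the recursive version returns both descendants while __subclasses__() alone returns only the middle class.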

Differentiate celery, kombu, PyAMQP and RabbitMQ/ironMQ

半城伤御伤魂 submitted on 2019-12-04 11:42:32

Question: I want to upload images to an S3 server, but before uploading I want to generate thumbnails in 3 different sizes, and I want it done outside the request/response cycle, hence I am using Celery. I have read the docs; here is what I have understood. Please correct me if I am wrong. Celery helps you manage your task queues outside the request/response cycle. Then there is something called carrot/kombu: it's a Django middleware that packages tasks that get created via Celery. Then the third layer

celerybeat - multiple instances & monitoring

孤街醉人 submitted on 2019-12-04 11:17:04

Question: I have an application built using Celery, and recently we got a requirement to run certain tasks on a schedule. I think celerybeat is perfect for this, but I have a few questions:

Is it possible to run multiple celerybeat instances, so that tasks are not duplicated?
How do I make sure that celerybeat is always up and running?

So far I have read https://github.com/celery/celery/issues/251 and https://github.com/ybrs/single-beat. It looks like a single instance of celerybeat should be running. I'm

Forking processes for every task in Celery

一个人想着一个人 submitted on 2019-12-04 10:41:02

I currently use a C extension library for Python, but it seems to have memory leaks. Tasks run on my celeryd do something with this C extension library, and celeryd eats a lot of memory about an hour later. I cannot patch this C extension library for many reasons, so instead I want to fork processes for every task in Celery. Are there any such options for Celery?

mher: You can use the CELERYD_MAX_TASKS_PER_CHILD option or the --maxtasksperchild celeryd switch. To restart worker processes after every task:

    CELERYD_MAX_TASKS_PER_CHILD = 1

https://celery.readthedocs.org/en/latest/userguide/workers

Import error in Celery

為{幸葍}努か submitted on 2019-12-04 10:04:28

Question: This is the code I am running:

```python
from __future__ import absolute_import
from celery import Celery

celery1 = Celery('celery', broker='amqp://', backend='amqp://', include=['tasks'])
celery1.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
)

if __name__ == '__main__':
    celery1.start()
```

When I execute the above code, it gives me the following error:

    ImportError: cannot import name Celery

Answer 1: I ran into this same error as well, and renaming the file fixed it. For anyone else encountering this, the
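The usual cause behind this error is a well-known Python pitfall: if your own module is named celery.py, `from celery import Celery` imports *your* file instead of the installed package, and the import of the Celery class fails. The demonstration below uses hypothetical file names to reproduce the shadowing in isolation:

```python
import importlib
import os
import sys
import tempfile

def import_shadows_package():
    """Create an empty local celery.py (like the asker's own module), put its
    directory first on sys.path, and show that importing `celery` picks up
    the empty local file, which has no Celery class."""
    d = tempfile.mkdtemp()
    with open(os.path.join(d, "celery.py"), "w") as f:
        f.write("")  # stand-in for the asker's own celery.py
    sys.path.insert(0, d)
    sys.modules.pop("celery", None)  # make sure nothing is cached
    try:
        mod = importlib.import_module("celery")
        return hasattr(mod, "Celery")  # False: the local file shadows the package
    finally:
        sys.path.pop(0)
        sys.modules.pop("celery", None)
```

Renaming the file (e.g. to celery_app.py) removes it from the import path's way, which matches the fix in the answer.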

Why am I getting '_SIGCHLDWaker' object has no attribute 'doWrite' in Scrapy?

ε祈祈猫儿з submitted on 2019-12-04 10:01:38

Question: I am using Scrapy spiders inside Celery, and I am getting this kind of error randomly:

```
Unhandled Error
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/twisted/python/log.py", line 103, in callWithLogger
    return callWithContext({"system": lp}, func, *args, **kw)
  File "/usr/lib/python2.7/site-packages/twisted/python/log.py", line 86, in callWithContext
    return context.call({ILogContext: newCtx}, func, *args, **kw)
  File "/usr/lib/python2.7/site-packages/twisted/python
```