celery

Flush Flower database occasionally and/or exit gracefully from Docker?

Submitted by 家住魔仙堡 on 2019-12-11 13:55:49
Question: I'm running Celery Flower in Docker (see this question for details). The command ends up being:

    celery -A proj flower --persistent=True --db=/flower/flower

I've got a persistent volume set up on /flower. However, it looks like Flower never writes anything to its database file, even after 30 minutes of uptime (during which ~120 tasks were processed):

    -rw-r--r-- 1 user user 0 Mar 11 00:08 flower.bak
    -rw-r--r-- 1 user user 0 Mar 10 23:29 flower.dat
    -rw-r--r-- 1 user user 0 Mar 11 00:08

Django celery not finding celery module

Submitted by 别来无恙 on 2019-12-11 13:25:03
Question: I'm trying to set up Celery with Django from the tutorial, but I keep getting ModuleNotFoundError: No module named 'celery'. I have a main project called Tasklist with the structure:

    - Tasklist/
      - manage.py
      - Tasklist/
        - __init__.py
        - settings.py
        - celery.py
        - urls.py

My __init__.py is as follows:

    from __future__ import absolute_import, unicode_literals
    from .celery import app as celery_app
    __all__ = ['celery_app']

And my celery.py is like so:

    from __future__ import absolute_import, unicode_literals
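The excerpt cuts off at the top of celery.py. For reference, a minimal sketch of that file following the standard Django/Celery layout, assuming the project package is named Tasklist as shown above:

```python
# Tasklist/Tasklist/celery.py -- a minimal sketch of the standard setup
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# Make sure Django settings are importable before the app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'Tasklist.settings')

app = Celery('Tasklist')

# Read CELERY_* settings from Django's settings module.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover tasks.py modules in all installed apps.
app.autodiscover_tasks()
```

In this situation the error often means the celery package is not installed in the interpreter that runs Django (e.g. a different virtualenv), or that sys.path resolution picks up the project's own celery.py instead of the installed package.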

Running Celery in the background with supervisor

Submitted by 会有一股神秘感。 on 2019-12-11 13:02:37
Running Celery in the background with supervisor

1. Install supervisor

    pip install supervisor

2. Configure supervisor

2.1 Generate the default configuration file

    # the generated configuration file can be written to any path
    echo_supervisord_conf > /etc/supervisord.conf

2.2 Edit the configuration file, appending at the end:

    [program:celery.worker]
    ; working directory
    directory=/root/my_prj
    ; the command runs from that directory; the log is saved in the project directory
    command=celery -A my_prj worker --loglevel info --logfile pro_celery_worker.log
    ; startup settings
    numprocs=1          ; number of processes
    autostart=true      ; start the program automatically when supervisor starts
    autorestart=true    ; restart automatically
    ; stop signal, TERM by default
    stopsignal=INT

To run several workers, add another block, for example:

    [program:pro.celery.worker]
    ; working directory
    directory=/root/my_prj1
    ; the command runs from that directory; the log is saved in the project directory
    command=celery -A my_prj1 worker --loglevel info --logfile

Celery raises an error when passing my queryset object as a parameter

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-11 12:07:55
Question: I'm trying to execute a periodic task, so I'm using Celery with Django 1.8, Django Rest Framework, and Postgres as the database. When I try to send my object to the task I get TypeError: foreign_model_obj is not JSON serializable. How can I pass my queryset object to my task? views.py:

    class MyModelCreateApiView(generics.CreateAPIView):
        queryset = MyModel.objects.all()
        serializer_class = MyModelSerializer
        authentication_classes = (TokenAuthentication,)

        def create(self, request, *args, **kwargs):
            data
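The usual workaround (a minimal sketch; the task name and import path below are illustrative, not from the original code) is to pass the object's primary key rather than the instance, since task arguments must survive JSON serialization, and re-fetch the row inside the task:

```python
# tasks.py -- hypothetical module; MyModel as in the question
from celery import shared_task
from myapp.models import MyModel  # assumed import path

@shared_task
def process_my_model(obj_pk):
    # Re-fetch the instance inside the task instead of serializing it.
    obj = MyModel.objects.get(pk=obj_pk)
    # ... work with obj here
    return obj.pk

# In the view, enqueue with the pk instead of the instance:
# process_my_model.delay(foreign_model_obj.pk)
```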

Running Celery as Daemon with Supervisor and Django on Elastic Beanstalk

Submitted by ぐ巨炮叔叔 on 2019-12-11 11:57:59
Question: I am attempting to get Celery to run in an EB environment with Django, and I have gotten super close to getting it all running. I have this config file for supervisor:

    #!/usr/bin/env bash

    # Get django environment variables
    celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`
    celeryenv=${celeryenv%?}

    # Create celery configuration script
    celeryconf="[program:celeryd-worker]
    ; Set full path

Celery log shows cleanup failed

Submitted by 做~自己de王妃 on 2019-12-11 11:34:02
Question: I am using Celery with Django. I see an error when I look at the Celery log for the automatically scheduled cleanup. I am not sure what this means, or what the implications of not doing the cleanup are. Any help is appreciated.

    [2013-09-28 23:00:00,204: ERROR/MainProcess] Task celery.backend_cleanup[65af1634-374a-4068-b1a5-749b70f7c78d] raised exception: NotImplementedError('No updates',)
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/celery-3.0.15-py2.7.egg/celery
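The failing task is the built-in celery.backend_cleanup entry that celery beat schedules when result expiry is enabled; it is only meaningful for result backends that actually implement cleanup (the database backend, for example). A hedged sketch of the relevant Celery 3.x settings, not taken from the original post:

```python
# settings.py -- Celery 3.x option names (a sketch, not the poster's configuration)

# How long task results are kept.  Beat schedules celery.backend_cleanup
# (daily at 04:00 by default) only when this is set and the backend does
# not expire results on its own.
CELERY_TASK_RESULT_EXPIRES = 24 * 3600

# If the configured backend raises NotImplementedError from cleanup,
# using a backend that supports it, or one that auto-expires results
# (such as redis), makes the periodic cleanup either work or unnecessary.
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'  # illustrative value
```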

Celery Python revoke

Submitted by 落爺英雄遲暮 on 2019-12-11 10:54:10
Question:

    software -> celery:3.1.20 (Cipater) kombu:3.0.35 py:2.7.6
                billiard:3.3.0.22 py-amqp:1.4.9
    platform -> system:Linux arch:64bit, ELF imp:CPython
    loader   -> celery.loaders.default.Loader
    settings -> transport:amqp results:amqp

Currently I have the following function:

    @task(bind=True, default_retry_delay=300, max_retries=3)
    def A(self, a, b, c, **kwargs):
        B()
        ...
        # code

This is the function I call to cancel A:

    @task(bind=True, default_retry_delay=300, max_retries=3)
    def cancelA(self, a, b, c, **kwargs)
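Cancelling a queued or running task is normally done by revoking its id rather than by dispatching a second task; a minimal sketch against the Celery 3.1 control API (proj is an assumed app module, and task_id would be the AsyncResult id saved when A was queued):

```python
# Hedged sketch -- not the poster's code
from proj.celery import app  # assumed location of the Celery app instance

def cancel_a(task_id):
    # Removes the task from the queue if it has not started yet; with
    # terminate=True the worker also kills the pool process currently
    # executing it (SIGTERM by default).
    app.control.revoke(task_id, terminate=True)
```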

Using celery in a django app to run a script with root privileges?

Submitted by 穿精又带淫゛_ on 2019-12-11 10:43:51
Question: From within my Django project, I need to run some commands that require root privileges on the Ubuntu box where the project resides. I'm using Celery to fire off an asynchronous process, and this process in turn calls shell commands that need root privileges to succeed. How can I do this without risking creating huge security holes? PS: The shell commands I need to call are smbpasswd, editing /etc/samba/smb.conf, and restarting the samba service. Source: https://stackoverflow.com/questions/3767841/using
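One commonly suggested pattern (a sketch under the assumption that a sudoers whitelist is acceptable; paths and function names are hypothetical) is to keep the worker running as an unprivileged user and allow it to sudo only the exact commands it needs, e.g. a sudoers line such as celeryuser ALL=(root) NOPASSWD: /usr/bin/smbpasswd, /usr/sbin/service smbd restart:

```python
# Task helpers -- hedged sketch, not the poster's code
import subprocess

def add_samba_user(username, password):
    # smbpasswd -s reads the new password twice from stdin instead of a tty
    proc = subprocess.Popen(
        ['sudo', '/usr/bin/smbpasswd', '-s', '-a', username],
        stdin=subprocess.PIPE)
    proc.communicate(('%s\n%s\n' % (password, password)).encode())

def restart_samba():
    # Restart only the specific service that was whitelisted in sudoers
    subprocess.check_call(['sudo', '/usr/sbin/service', 'smbd', 'restart'])
```

Editing /etc/samba/smb.conf can be handled the same way, by sudo-ing a single dedicated script rather than a general-purpose editor.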

Do subtasks inherit the queue of their parent task?

Submitted by 只愿长相守 on 2019-12-11 10:07:29
Question: When creating a subtask (i.e. chord, chain, group) with Celery, and you have multiple queues (e.g. high priority, low priority), does the subtask inherit the routing parameters of the task that created it?

Answer 1: Answering my own question, having actually read the source...

Short answer: No.

Long answer: Tasks instantiated with mytask.s() and mytask.si() call celery.app.Task.subtask() (called signature() in master), which does not set any routing information. Compare this to retry(), which calls
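Given that, the queue has to be attached to each signature explicitly when the workflow is built; a minimal sketch (task names and import path are illustrative, not from the original question):

```python
# Hedged sketch: pinning every part of a chain to the same queue explicitly
from celery import chain
from myapp.tasks import fetch_data, process_data  # hypothetical tasks

item_id = 42  # example argument

workflow = chain(
    fetch_data.s(item_id).set(queue='high_priority'),
    process_data.s().set(queue='high_priority'),
)
workflow.apply_async()
```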

Celery not using all concurrent slots

Submitted by 纵饮孤独 on 2019-12-11 09:25:24
Question: I have a Celery cluster made up of machines with 8-core processors. Each machine has one worker that is set to a concurrency factor of 8 (-c8). I often see nodes with a lot of reserved tasks, but only one or two running simultaneously. My tasks are often long-running, with a lot of compute and I/O. Any ideas as to why this is happening, and what I can do to increase the number of tasks running simultaneously? Does Celery throttle the number of active tasks based on system load? I looked
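With long-running tasks this symptom is commonly traced to prefetching: the worker reserves many messages up front and, with the default scheduling, hands several of them to the same pool process, so some processes queue up work while others sit idle. A hedged sketch of the Celery 3.x settings usually adjusted for this kind of workload (not taken from the original post), typically combined with starting the worker with -Ofair on Celery 3.1+:

```python
# settings sketch -- Celery 3.x option names
CELERYD_PREFETCH_MULTIPLIER = 1   # each process reserves at most one extra message
CELERY_ACKS_LATE = True           # acknowledge after the task finishes, not on receipt
```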