celery

Celery Beat: Limit to single task instance at a time

Submitted by 有些话、适合烂在心里 on 2020-02-26 18:22:19
Question: I have celery beat and celery (four workers) to do some processing steps in bulk. One of those tasks is roughly along the lines of, "for each X that hasn't had a Y created, create a Y." The task is run periodically at a semi-rapid rate (10sec). The task completes very quickly. There are other tasks going on as well. I've run into the issue multiple times in which the beat tasks apparently become backlogged, and so the same task (from different beat runs) is executed simultaneously, causing
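A common remedy for this is to guard the task body with a lock so that overlapping runs skip instead of stacking up. A minimal sketch of the pattern, in which a plain dict stands in for a shared store such as Redis (in production the claim must be atomic, e.g. Redis `SET NX` with a TTL; all names here are illustrative):

```python
import time

# Stand-in for a shared cache (e.g. Redis). In production the
# add-if-absent below must be atomic, such as Redis SET NX with a TTL.
cache = {}

LOCK_TTL = 60  # seconds; should exceed the task's worst-case runtime


def acquire_lock(key):
    """Claim the lock unless a live (unexpired) one already exists."""
    now = time.time()
    expires = cache.get(key)
    if expires is not None and expires > now:
        return False  # another instance is still running
    cache[key] = now + LOCK_TTL
    return True


def release_lock(key):
    cache.pop(key, None)


def create_missing_ys():
    """Body of the periodic task: skip if a previous run is still active."""
    if not acquire_lock("create-missing-ys"):
        return "skipped"
    try:
        # ... the actual work: for each X without a Y, create a Y ...
        return "ran"
    finally:
        release_lock("create-missing-ys")
```

With this guard, a backlogged beat tick finds the lock held and returns immediately instead of duplicating work.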

Django - Set up a scheduled job?

Submitted by 梦想的初衷 on 2020-02-26 14:03:25
Question: I've been developing a web app with Django and I'm curious whether there is a way to schedule a job to run periodically. Basically, I just want to run through the database and make some calculations/updates automatically on a regular basis, but I can't seem to find any documentation on doing this. Does anyone know how to set this up? To clarify: I know I can set up a cron job to do this, but I'm curious whether there is some feature in Django that provides this functionality. I'd like people to be able to deploy this app themselves without having to do much configuration (preferably zero). I've considered triggering these actions "retroactively" by simply checking whether a job should have been run since the last request was sent to the site, but I'm hoping for something a little cleaner. Answer #1: Celery is a distributed task queue built on AMQP (RabbitMQ). It also handles periodic tasks in a cron-like fashion (see periodic tasks). Depending on your app, it might be worth a look. Celery is pretty easy to set up with django (docs), and periodic tasks will actually skip missed runs in case of downtime. Celery also has a built-in retry mechanism in case a task fails. Answer #2: Put the following at the top of your cron.py file: #!/usr/bin/python import os, sys sys.path.append('/path/to/') # the parent directory of the project sys.path.append('/path/to/project') # these
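The periodic tasks mentioned in Answer #1 are driven by a schedule mapping that celery beat reads. A sketch of the shape such a schedule takes (the task paths are hypothetical; check the Celery periodic-tasks docs for the exact setting name in your version):

```python
from datetime import timedelta

# Shape of a celery beat schedule: each entry names a registered task
# and how often to run it. Task dotted paths here are hypothetical.
beat_schedule = {
    "update-stats-every-5-minutes": {
        "task": "myapp.tasks.update_stats",
        "schedule": timedelta(minutes=5),
    },
    "nightly-cleanup": {
        "task": "myapp.tasks.cleanup",
        "schedule": timedelta(days=1),
    },
}
```

Each key is a human-readable schedule name; beat resolves the `task` string against the tasks registered with the app.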

Teuthology Platform Deployment Manual

Submitted by 心已入冬 on 2020-02-26 11:07:31
Teuthology architecture: the flow of the automated test framework is roughly as follows. Software components used by the Teuthology test framework:
Jenkins — continuous integration tool
teuthology — the Ceph test suite
shaman — queries the chacra nodes that provide packages; scales out and schedules chacra nodes
chacra — REST API for managing binary packages and files for different architectures
Teuthology deployment — deploying the paddles/pulpito node. Install dependencies: yum install git python-dev python-virtualenv postgresql postgresql-contrib postgresql-server-dev-all supervisor gcc epel-release. Configure the PostgreSQL database. Initialize it: postgresql-setup initdb, then systemctl start postgresql (start the database) and systemctl enable postgresql. Once PostgreSQL is installed and started, a postgres user is created automatically: su - postgres (switch to that user), then -bash-4.2$ psql (enter the database), then postgres=# \password postgres (change the postgres user's password; just type the password at the prompt)
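Once the database and user exist, paddles needs a connection URL pointing at it. A small sketch of composing such a SQLAlchemy-style PostgreSQL DSN (the user, password, and database names are illustrative, not taken from the manual):

```python
from urllib.parse import quote, urlsplit

def postgres_dsn(user, password, host, dbname, port=5432):
    """Build a SQLAlchemy-style PostgreSQL URL; credentials are URL-quoted."""
    return f"postgresql://{quote(user)}:{quote(password)}@{host}:{port}/{dbname}"

# Illustrative values for a local paddles database.
dsn = postgres_dsn("paddles", "secret", "localhost", "paddles")
```

Quoting the credentials matters when the password contains characters like `@` or `/` that would otherwise break URL parsing.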

Supervisor configuration for Celery

Submitted by 我与影子孤独终老i on 2020-02-26 05:11:12
Hosting celery under supervisor in a django app deployment. Supervisor conf file django_app_celery.conf — the worker start command can be adjusted per the Celery docs:
[program:appname_celery]
command=/appname/env/bin/celery -A sanqing worker -l debug
user=root
directory = /appname/ ; working directory of the program
autostart = true ; start automatically when supervisord starts
startsecs = 5 ; consider the program started if it runs 5 seconds without an abnormal exit
autorestart = true ; restart automatically after an abnormal exit
startretries = 3 ; number of retries on failed start, default 3
redirect_stderr = true ; redirect stderr to stdout, default false
;stdout_logfile_maxbytes = 20MB ; stdout log file size, default 50MB
stdout_logfile_backups = 20 ; number of stdout log file backups
; stdout log file; note that startup fails if the specified directory does not exist, so create the directory manually (supervisord
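Supervisor config files are plain INI, so a quick sanity check before reloading supervisord can be done with Python's configparser. A sketch mirroring the fragment above (inline `;` comments dropped for simplicity):

```python
import configparser

# A supervisor [program:x] section like the one above, as INI text.
conf_text = """
[program:appname_celery]
command=/appname/env/bin/celery -A sanqing worker -l debug
autostart=true
startsecs=5
autorestart=true
startretries=3
redirect_stderr=true
"""

parser = configparser.ConfigParser()
parser.read_string(conf_text)  # raises on malformed INI
section = parser["program:appname_celery"]
```

This catches typos like a missing `]` or a duplicated key before supervisord ever sees the file.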

Celery Configuration Practice Notes

Submitted by 那年仲夏 on 2020-02-26 04:07:25
A few words first: these notes collect some practices from configuring celery at work — partly as a memo, partly as documentation for colleagues. The demo project is also published on GitHub: https://github.com/blackmatrix7/celery-demo. The notes will be adjusted and improved as experience accumulates, though the GitHub copy is usually updated faster. Celery configuration practice notes, currently covering: executing tasks asynchronously; assigning different queues to different tasks; scheduled tasks. Still to be added: configuring different priorities for different tasks; return values of Celery tasks. Creating Celery, configuring Celery parameters: before creating the celery instance, some celery parameters need to be configured. Some of the more commonly used Celery settings:
CELERY_DEFAULT_QUEUE — default queue name, used when no queue is specified for a task
CELERY_BROKER_URL — message broker, used by publishers to pass messages to consumers; RabbitMQ recommended
CELERY_RESULT_BACKEND — backend for storing task execution results; redis recommended
CELERY_TASK_SERIALIZER — serialization format for tasks
CELERY_RESULT_SERIALIZER — serialization format for task results
CELERY_ACCEPT_CONTENT CELERYD
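Since Celery 4, these settings also exist under lowercase names; the uppercase forms in the list above date from Celery 3. A sketch of the correspondence for the settings listed here — worth double-checking against the Celery settings reference for your version:

```python
# Old (Celery 3, uppercase) -> new (Celery 4+, lowercase) setting names.
# Verify against the Celery settings reference for your version.
SETTING_RENAMES = {
    "CELERY_DEFAULT_QUEUE": "task_default_queue",
    "CELERY_BROKER_URL": "broker_url",
    "CELERY_RESULT_BACKEND": "result_backend",
    "CELERY_TASK_SERIALIZER": "task_serializer",
    "CELERY_RESULT_SERIALIZER": "result_serializer",
    "CELERY_ACCEPT_CONTENT": "accept_content",
}

def modernize(config):
    """Rewrite uppercase setting names to their lowercase equivalents."""
    return {SETTING_RENAMES.get(key, key): value for key, value in config.items()}
```

Mixing the two naming styles in one config is a classic source of silently ignored settings, so normalizing in one place helps.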

How to provide user constant notification about Celery's Task execution status?

Submitted by 爷,独闯天下 on 2020-02-25 13:26:26
Question: I integrated my project with celery in this way, inside views.py after receiving a request from the user:
def upload(request):
    if "POST" == request.method:
        # save the file
        task_parse.delay()
    # continue
and in tasks.py:
from __future__ import absolute_import
from celery import shared_task
from uploadapp.main import aunit
@shared_task
def task_parse():
    aunit()
    return True
In short, the shared task will run a function aunit() from a third python file located in the uploadapp/ directory, named main.py.
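To keep the user informed, the usual pattern is to return the task id from the view and have the client poll a status endpoint that reads the task's state (with Celery, via `AsyncResult(task_id).state`). A celery-free sketch of that lifecycle, with a thread standing in for the worker and a dict standing in for the result backend (names are illustrative):

```python
import threading
import time

# task_id -> state; Celery's result backend plays this role for real tasks.
task_states = {}


def run_task(task_id, func):
    """Run func in the background, recording PENDING -> STARTED -> SUCCESS."""
    task_states[task_id] = "PENDING"

    def wrapper():
        task_states[task_id] = "STARTED"
        func()
        task_states[task_id] = "SUCCESS"

    worker = threading.Thread(target=wrapper)
    worker.start()
    return worker


def get_status(task_id):
    """What a /status/<task_id> endpoint would return to the client."""
    return task_states.get(task_id, "UNKNOWN")


handle = run_task("abc123", lambda: time.sleep(0.05))
handle.join()
```

The client then polls `/status/<task_id>` until it sees a terminal state and updates the page accordingly.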

Celery daemon - how to configure it to run multiple tasks from multiple Flask applications?

Submitted by 流过昼夜 on 2020-02-23 07:24:49
Question: I have a flask app myapp_A that uses celery to run some asynchronous tasks, and I have configured celery to run as a daemon process. Here is the service script, /etc/default/celery:
# Name of nodes to start
CELERYD_NODES="w1"
# Absolute or relative path to the 'celery' command:
CELERY_BIN="/var/www/myapp_A.com/public_html/venv/bin/celery"
# App instance to use
CELERY_APP="myapp_A.celery"
# Where to chdir at start.
CELERYD_CHDIR="/var/www/myapp_A.com/public_html/"
# Extra command-line
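One common way to serve a second Flask app is to run a separate daemon per app, each with its own environment file and service unit, rather than cramming both apps into one worker. This is an assumption about the setup, not something stated in the excerpt; a sketch of a second env file (all paths and names illustrative):

```shell
# /etc/default/celery_B -- env file for a second app (myapp_B), consumed
# by its own service unit; paths and names are illustrative.
CELERYD_NODES="w1"
CELERY_BIN="/var/www/myapp_B.com/public_html/venv/bin/celery"
CELERY_APP="myapp_B.celery"
CELERYD_CHDIR="/var/www/myapp_B.com/public_html/"
# Distinct pid/log locations keep the two daemons from colliding.
CELERYD_PID_FILE="/var/run/celery/myapp_B_%n.pid"
CELERYD_LOG_FILE="/var/log/celery/myapp_B_%n%I.log"
```

With separate units, each app's workers can be started, stopped, and upgraded independently.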

Celery: Interact/Communicate with a running task

Submitted by 家住魔仙堡 on 2020-02-22 08:25:45
Question: A related (albeit not identical) question appears here: Interact with celery ongoing task. It's easy to start a task and get its unique ID:
async_result = my_task.delay()
task_id = async_result.task_id
It's easy to broadcast a message that will reach a custom command in the worker:
my_celery_app.control.broadcast('custom_command', arguments={'id': task_id})
The problem arises that the worker is started in the form of a small process tree formed of one supervisor and a number of children. The
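Because the broadcast reaches every worker, a common trick is for each worker to keep a registry of the task ids it currently owns and act only when the broadcast id matches. A celery-free sketch of that dispatch logic (class and method names are illustrative):

```python
# Each worker tracks the task ids it is currently running; a broadcast
# handler consults the registry and ignores ids owned by other workers.
class Worker:
    def __init__(self, name):
        self.name = name
        self.running = set()   # task ids owned by this worker
        self.received = []     # commands this worker actually acted upon

    def handle_custom_command(self, task_id, command):
        if task_id not in self.running:
            return False  # broadcast was not meant for this worker
        self.received.append((task_id, command))
        return True


def broadcast(workers, task_id, command):
    """Deliver the command to every worker; only the owner acts on it."""
    return [w.name for w in workers if w.handle_custom_command(task_id, command)]


w1, w2 = Worker("w1"), Worker("w2")
w1.running.add("t-42")
acted = broadcast([w1, w2], "t-42", "pause")
```

In real Celery the registry would be maintained by task-started/-finished signals, and the handler would be registered as a custom remote control command.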
