celery

3.3 SMS - Asynchronous Execution

Submitted by 喜欢而已 on 2020-01-13 16:01:35
In the flow shown above, requesting the Yuntongxun (云通) server and having it send the SMS to the user is time-consuming, so the code for this slow step is moved into a new process for execution.

# Asynchronous execution
- Problem: if the view contains time-consuming code, the user has to wait a long time for the response; it takes a long while before the 60-second countdown appears.
- Need: the user should get a response quickly and see the countdown right away.
- Solution: run the time-consuming code asynchronously, e.g. in a process, thread, or coroutine.
- New problem: all the low-level code is already wrapped by the framework (e.g. Django), so how do we perform asynchronous operations?
- Solution: use celery, which already packages the asynchronous machinery.

### rabbitmq
- a message queue
- install rabbitmq
- install the pika package in the Python virtual environment

### Basic concepts of celery
- Components
  - broker: specifies where the queue is stored
  - worker: takes tasks from the queue and runs them; essentially a new process, thread, or coroutine
  - queue: holds tasks, which are executed one by one; rabbitmq is used as the queue here
  - task: the time-consuming code
- Install celery: pip install celery
- Implementation steps (see the sketch after this list):
  - create a celery_tasks package to hold the task code
  - create main.py and create the celery object there
  - create config.py for the configuration; for now it specifies rabbitmq as the queue
  - create an sms package with a tasks.py, and write the task code in that file
  - define a send_sms function and move the time-consuming code into it
  - add the @app.task() decorator
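A minimal sketch of that layout, assuming Celery 4+ (lowercase settings), a RabbitMQ broker on localhost with the default guest account, and placeholder task arguments; the real Yuntongxun call is left out:

```python
# A single-module sketch (in a real project this would be split into
# celery_tasks/main.py, celery_tasks/config.py and celery_tasks/sms/tasks.py).
from celery import Celery

# main.py: create the Celery object; the broker setting (config.py in the notes
# above) points the queue at a local RabbitMQ with the default guest account.
app = Celery('celery_tasks', broker='amqp://guest:guest@127.0.0.1:5672//')

# sms/tasks.py: the time-consuming code cut out of the view.
@app.task(name='send_sms')
def send_sms(mobile, sms_code):
    # the real Yuntongxun API call would go here; print is only a placeholder
    print('sending code %s to %s' % (sms_code, mobile))
```

The worker would then be started with something like celery -A celery_tasks.main worker -l info, and the view replaces the direct SMS call with send_sms.delay(mobile, sms_code), so the response (and the 60-second countdown) comes back immediately.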

how to get the queue in which a task was run - celery

Submitted by 两盒软妹~` on 2020-01-13 14:05:12

Question: I'm new to celery and have a question. I have this simple task:

@app.task(name='test_install_queue')
def test_install_queue():
    return subprocess.call("exit 0", shell=True)

and I am calling the task later in a test case like result = tasks.test_default_queue.apply_async(queue="install"). The task runs successfully in the install queue (I can see it in the celery log, and it completes fine), but I would like to know a programmatic way of finding out in which queue the task test…
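One approach that is often suggested (a sketch, not necessarily the accepted answer here) is to bind the task and read its delivery metadata at run time; self.request.delivery_info carries the exchange and routing key, and with the default direct-exchange setup the routing key matches the queue name. The broker URL below is an assumption:

```python
import subprocess

from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')  # broker URL assumed

@app.task(name='test_install_queue', bind=True)
def test_install_queue(self):
    # delivery_info is filled in by the worker from the AMQP message properties
    routing_key = self.request.delivery_info.get('routing_key')
    print('task consumed with routing key %r' % routing_key)
    return subprocess.call("exit 0", shell=True)
```

Called as in the question, e.g. test_install_queue.apply_async(queue="install"), the worker log would then show the install routing key.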

Celery configure separate connection for producer and consumer

Submitted by 六眼飞鱼酱① on 2020-01-13 10:55:13

Question: We have an application set up on Heroku which uses celery to run background jobs. The celery app uses RabbitMQ as the broker; we used Heroku's RabbitMQ Bigwig add-on as the AMQP message broker. This add-on provides two separate URLs, one optimized for producers and the other optimized for consumers. Also, as per the RabbitMQ documentation, it is recommended to use separate connections for producer and consumer. The Celery documentation does not provide a way to specify connections separately for producer and…
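For what it's worth, Celery 4 later added the broker_read_url and broker_write_url settings, which let consumers and producers connect through different URLs; a sketch with placeholder Bigwig-style URLs (the host names and credentials are made up):

```python
from celery import Celery

app = Celery('tasks')
app.conf.update(
    # workers (consumers) connect through the consumer-optimized URL
    broker_read_url='amqp://user:pass@bigwig-consumer.example.com/vhost',
    # apply_async()/delay() calls (producers) publish through the producer URL
    broker_write_url='amqp://user:pass@bigwig-producer.example.com/vhost',
)
```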

init.d celery script for CentOS?

Submitted by 一个人想着一个人 on 2020-01-13 05:54:26

Question: I'm writing a Django app that uses celery. So far I've been running on Ubuntu, but I'm trying to deploy to CentOS. Celery comes with a nice init.d script for Debian-based distributions, but it doesn't work on RedHat-based distributions like CentOS because it uses start-stop-daemon. Does anybody have an equivalent one for RedHat that uses the same variable conventions, so I can reuse my /etc/default/celeryd file? Answer 1: This is better solved here: Celery CentOS init script. You should be good using…

How to configure CELERYBEAT_SCHEDULE in Django settings?

Submitted by 假如想象 on 2020-01-13 03:38:13

Question: I can get this to run as a standalone application, but I am having trouble getting it to work in Django. Here is the standalone code:

from celery import Celery
from celery.schedules import crontab

app = Celery('tasks')
app.conf.update(
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_TIMEZONE='US/Central',
    CELERY_ENABLE_UTC=True,
    CELERYBEAT_SCHEDULE={
        'test': {
            'task': 'tasks.test',
            'schedule': crontab(),
        },
    },
)

@app.task
def test():…
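The usual way to move this into Django (a sketch, assuming a project package named proj and the old uppercase setting names used in the question) is to put the schedule in settings.py and let the Celery app read its configuration from Django's settings:

```python
# proj/settings.py
from celery.schedules import crontab

CELERY_TIMEZONE = 'US/Central'
CELERYBEAT_SCHEDULE = {
    'test': {
        'task': 'tasks.test',
        'schedule': crontab(),  # crontab() with no arguments fires every minute
    },
}

# proj/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# read the CELERY*/CELERYBEAT* settings from the Django settings module
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()  # older Celery versions need lambda: settings.INSTALLED_APPS
```

The schedule is then driven by running the beat process, e.g. celery -A proj beat, next to the worker.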

Running multiple instances of celery on the same server

Submitted by 情到浓时终转凉″ on 2020-01-12 18:52:29

Question: I want to run two instances of celery on the same machine, one for the 'A' version of my application and the other for the 'B' version. I have two instances, which I start like this:

(env1)/home/me/firstapp$ celery -A app.tasks worker --config celeryconfig
(env2)/home/me/secondapp$ celery -A app.tasks worker -n Carrot --config celeryconfig

In tasks.py in each application, I create a celery instance like this:

celery = Celery('tasks', backend='amqp', broker='amqp://guest@127.0.0.1:5672//')…
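One common way to keep the two deployments from consuming each other's messages (a sketch, not taken from the answers in this thread; the vhost and node names are made up) is to give each app its own RabbitMQ virtual host and a distinct worker node name:

```python
# firstapp/tasks.py -- app A talks only to its own virtual host
from celery import Celery

celery = Celery('tasks', backend='amqp',
                broker='amqp://guest@127.0.0.1:5672/app_a')

# secondapp/tasks.py would do the same with broker='amqp://guest@127.0.0.1:5672/app_b'.
#
# The vhosts have to exist first, e.g.:
#   rabbitmqctl add_vhost app_a
#   rabbitmqctl add_vhost app_b
#   rabbitmqctl set_permissions -p app_a guest ".*" ".*" ".*"
#
# and each worker gets its own node name so the two do not clash:
#   celery -A app.tasks worker -n workerA@%h --config celeryconfig
#   celery -A app.tasks worker -n workerB@%h --config celeryconfig
```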

Framing Errors in Celery 3.0.1

Submitted by 江枫思渺然 on 2020-01-12 07:52:08

Question: I recently upgraded to Celery 3.0.1 from 2.3.0 and all the tasks run fine. Unfortunately, I'm getting a "Framing Error" exception pretty frequently. I'm also running supervisor to restart the threads, but since these are never really killed, supervisor has no way of knowing that celery needs to be restarted. Has anyone seen this before?

[2012-07-13 18:53:59,004: ERROR/MainProcess] Unrecoverable error: Exception('Framing Error, received 0x00 while expecting 0xce',)
Traceback (most recent call…

Celery works, but with flower it doesn't work

Submitted by 六眼飞鱼酱① on 2020-01-12 07:45:09

Question: I have installed celery, RabbitMQ, and flower. I am able to browse to the flower port. I have the following simple worker that I can attach to celery and call from a python program:

# -*- coding: utf-8 -*-
"""
Created on Sat Dec 12 16:37:33 2015
@author: idf
"""
from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

This program calls it:

# -*- coding: utf-8 -*-
"""
Created on Sat Dec 12 16:40:16 2015
@author: idf
"""
from…
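For reference, a tidy version of the worker module from the question together with the commands typically used to run the worker and flower against it (the module name tasks.py and the guest broker are taken from the question; the flower port shown is the default):

```python
# tasks.py
from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

# Typical invocation (shell):
#   celery -A tasks worker --loglevel=info
#   celery -A tasks flower            # flower attaches to the same app/broker
# Flower then serves its dashboard on http://localhost:5555 by default,
# where add tasks sent via add.delay(2, 3) should show up.
```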