celery

invoke celery task from tornado [duplicate]

风格不统一 submitted on 2019-11-30 16:19:22
Question: This question already has answers here: Tornado celery integration hacks (4 answers). Closed 6 years ago. How can someone invoke a Celery task from Tornado and get the result via a callback? This post claims that one simply has to publish a message via RabbitMQ and the task will then be executed. That makes sense, but can someone give an example in Python (better yet in Tornado, with a callback)? Personally I use MongoDB as my message broker, but I could switch to Redis or RabbitMQ as well.

Redis blocked up, causing cached data to be cleared

白昼怎懂夜的黑 submitted on 2019-11-30 15:56:56
Scenario: zy's link-monitoring alerts all recovered at once; in that time window zabbix showed the recovery events and fired the alert actions, yet the alerts were never actually delivered. That should not be possible. The likely cause was that the data in the Redis cache had been wiped, with no human intervention, and the problem needed to be tracked down.

Approach: First check the code. Everywhere there is a cache.get there is a matching cache.set, the celery period is 30s, and those keys all have a default lifetime of 300s, so key expiry could not be the cause. With the code confirmed correct, check zabbix's problem history, the zabbix audit report, the messaging platform's celery log, and the system log. At 21:02 the message-send action completed on the zabbix side, but nothing ever reached the messaging platform. Checking the monitoring system's celery log next: celery was throwing errors between 20:50 and 21:10, and that is where the problem was. (Evidence examined: the messaging platform's celery log, the messaging platform's log, the zabbix alert records all recovering at the same moment, and the monitoring system's celery log.)

Resolution: comment out the task that raised the error, so that celery keeps running normally. In fact this error had been appearing for a while, but since it did not affect the business at the time it was shrugged off, and then it blew up. Do not let small problems slide; they will eventually explode. Fix them.

Source: https://www.cnblogs.com/0916m/p/11602888.html

Flask: running background tasks with Celery

依然范特西╮ submitted on 2019-11-30 15:13:10
# Flask: running background tasks with Celery

1. Basic setup

Docs: https://flask.palletsprojects.com/en/1.0.x/patterns/celery/

    mkdir celery_tasks

celery_tasks/__init__.py:

```python
# instantiate Celery
from celery import Celery

my_celery = Celery('my_celery')
```

celery_tasks/task_1.py:

```python
# a Celery task
from celery_tasks import my_celery

@my_celery.task
def add_together(a, b):
    # print('add_together running...')
    return a + b
```

app/__init__.py:

```python
# configure the Celery instance
from .app import create_app, db

flask_app = create_app()

from celery_tasks import my_celery
from .app import _handle_celery_task

_handle_celery_task(flask_app, my_celery)
```

The configuration helper lives in app/app.py:

```python
def _handle_celery_task(app, my_celery):
    #
```

Celery usage - 3

我只是一个虾纸丫 submitted on 2019-11-30 15:13:05
# Celery usage

1. broker

2. A basic example

Use Redis as both broker and backend.

Create tasks.py:

```python
# tasks.py
from celery import Celery

di = 'redis://:****@localhost:6379/0'
app = Celery('tasks', backend=di, broker=di)

@app.task
def add(x, y):
    return x + y
```

Run:

    celery -A tasks worker -l info -P eventlet

Create temp.py:

```python
# temp.py
from tasks import add

rv = add.delay(4, 4)
```

2.1 Run result

Running tasks:

```
E:\python\code test>celery -A tasks worker -l info -P eventlet

 -------------- celery@*** v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-10 2019-09-21 22:08:04
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x1aebfdcf98
- ** ---------- .>
```

celery task - 2

丶灬走出姿态 submitted on 2019-11-30 15:12:43
# celery task

Preface

Consider a scheduled task. In general, the features needed are:

- encapsulated as an object that executes independently;
- the object exposes interfaces for inspecting its state;
- invoked on a schedule;
- behaviour control, including retries and success/failure callbacks.

The sections below cover how celery implements each of these.

1. task basics

Celery's base task class is tasks.Task().

1.1 Bound tasks

Binding means the first argument defaults to self:

```python
logger = get_task_logger(__name__)

@task(bind=True)
def add(self, x, y):
    logger.info(self.request.id)
```

1.2 Inheriting from Task

Note where the base class is declared: at the point where the function is decorated into a Task:

```python
@app.task(base=MyTask)
def add(x, y):
    # raise KeyError
    return x + y
```

1.3 Names

Every task instance has a unique name, for example:

```python
@app.task(name='tasks.mul')
def mul(x, y):
```

This feature is rarely necessary, especially when task functions live in their own module: the default name is the module name plus the function name (e.g. celery_tasks.mul). Try not to name a task module tasks.py; a name like celery_1.py is better.

Best way to map a generated list to a task in celery

风流意气都作罢 submitted on 2019-11-30 15:12:17
I am looking for advice on the best way to map a list generated by one task onto another task in celery. Let's say I have a task called parse, which parses a PDF document and outputs a list of pages. Each page then needs to be individually passed to another task called feed. All of this needs to happen inside a task called process. So, one way I could do that is this:

```python
@celery.task
def process():
    pages = parse.s(path_to_pdf).get()
    feed.map(pages)
```

Of course, that is not a good idea, because I am calling get() inside a task. Additionally this is inefficient, since my parse task is wrapped around a

Retrieving GroupResult from taskset_id in Celery?

主宰稳场 submitted on 2019-11-30 13:13:39
I am starting a set of celery tasks using a celery group, as described in the official documentation. I am also storing the group (taskset) id in a DB, in order to poll celery for the taskset's state:

```python
job = group([
    single_test.s(1, 1),
    single_test.s(1, 2),
    single_test.s(1, 3),
])
result = job.apply_async()

test_set = MyTestSet()
test_set.taskset_id = result.id  # store test_set into DB
```

Is there a way to obtain a GroupResult object (i.e. my result) starting from the taskset id? Something like what is done in this question, but working with celery groups. I already tried doing:

```python
r = GroupResult
```

Setting up periodic tasks in Celery (celerybeat) dynamically using add_periodic_task

六月ゝ 毕业季﹏ submitted on 2019-11-30 13:03:17
I'm using Celery 4.0.1 with Django 1.10 and I have trouble scheduling tasks (running a task works fine). Here is the celery configuration:

```python
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')

app = Celery('myapp')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.BROKER_URL = 'amqp://{}:{}@{}'.format(
    settings.AMQP_USER, settings.AMQP_PASSWORD, settings.AMQP_HOST)
app.conf.CELERY_DEFAULT_EXCHANGE = 'myapp.celery'
app.conf.CELERY_DEFAULT_QUEUE = 'myapp.celery_default'
app.conf.CELERY_TASK_SERIALIZER = 'json'
app.conf.CELERY_ACCEPT_CONTENT = ['json']
app.conf.CELERY
```

Django Celery tutorial not returning results

故事扮演 submitted on 2019-11-30 12:14:01
UPDATE 3: Found the issue. See the answer below.

UPDATE 2: It seems I might have been dealing with an automatic-naming and relative-imports problem by running the djcelery tutorial through the manage.py shell; see below. It is still not working for me, but now I get new error messages in the log. See below.

UPDATE: I added the log at the bottom of the post. It seems the example task is not registered?

Original post: I am trying to get django-celery up and running. I was not able to get through the example. I installed rabbitmq successfully and went through the tutorials without trouble: http://www

Using mock to patch a celery task in Django unit tests

三世轮回 submitted on 2019-11-30 10:50:03
I'm trying to use the Python mock library to patch a Celery task that is run when a model is saved in my Django app, to verify that it's being called correctly. Basically, the task is defined inside myapp.tasks and is imported at the top of my models.py file like so:

```python
from .tasks import mytask
```

...and then runs on save() inside the model using mytask.delay(foo, bar). So far so good: it works fine when I'm actually running celeryd etc. I want to construct a unit test that mocks the task, just to check that it gets called with the correct arguments, and doesn't actually try to run the Celery
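The key detail in this situation is the patch target: because models.py does `from .tasks import mytask`, the test must patch `myapp.models.mytask` (the name as used), not `myapp.tasks.mytask` (the name as defined). A self-contained demonstration with a stand-in module, since this sketch has no real Django app:

```python
import sys
import types
from unittest import mock

# stand-in for myapp.models, which did "from .tasks import mytask"
models = types.ModuleType('fake_models')
models.mytask = None  # placeholder for the imported task
sys.modules['fake_models'] = models

def save(foo, bar):
    # what the model's save() override would do
    models.mytask.delay(foo, bar)

models.save = save

# patch the name in the *models* namespace, where it is looked up at call time
with mock.patch('fake_models.mytask') as mytask:
    models.save('foo', 'bar')
    mytask.delay.assert_called_once_with('foo', 'bar')
```

In the real Django test this becomes @mock.patch('myapp.models.mytask') on the test method, and nothing ever reaches the broker.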