celery

Broadcast messages in celery

Anonymous (unverified), submitted 2019-12-03 08:46:08
Question: I'm using Celery and want to send a broadcast task to a couple of workers. I'm trying to do it as described at http://docs.celeryproject.org/en/latest/userguide/routing.html#broadcast, so I created a simple app with a task:

```python
@celery.task
def do_something(value):
    print value
```

and in the app I added:

```python
from kombu.common import Broadcast
CELERY_QUEUES = (Broadcast('broadcast_tasks'), )
CELERY_ROUTES = {'my_app.do_something': {'queue': 'broadcast_tasks'}}
```

and then I tried to send the task to the workers with: my_app.do_something.apply_async(['222'], queue=
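For reference, a minimal end-to-end sketch of the broadcast setup the question describes, assuming the module is named my_app and that the broker URL is a placeholder; the key point from the linked docs is that apply_async must target the broadcast queue by name:

```python
from celery import Celery
from kombu.common import Broadcast

app = Celery('my_app', broker='amqp://localhost//')  # assumed broker URL

# Declare a broadcast queue and route the task to it.
app.conf.CELERY_QUEUES = (Broadcast('broadcast_tasks'),)
app.conf.CELERY_ROUTES = {'my_app.do_something': {'queue': 'broadcast_tasks'}}

@app.task
def do_something(value):
    print(value)

# Every worker consuming the broadcast queue receives its own copy of the task.
do_something.apply_async(args=['222'], queue='broadcast_tasks')
```

Workers also need to be consuming the broadcast queue (for example, started with -Q broadcast_tasks) for the fan-out to reach them.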

Shared XMPP connection between Celery workers

Anonymous (unverified), submitted 2019-12-03 08:44:33
Question: My web app needs to be able to send XMPP messages (Facebook Chat), and I thought Celery might be a good solution for this. A task would consist of querying the database and sending the XMPP message to a number of users. However, with that approach I would have to connect to the XMPP server every time I run a task, which is not a great idea. From the Facebook Chat API docs: "Best Practices: Your Facebook Chat integration should only be used for sessions that are expected to be long-lived. Clients should not rapidly churn on and off." Is there a
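One commonly suggested pattern for this situation (not taken from the truncated excerpt) is to open a single connection per worker process when the worker starts and reuse it across tasks. A minimal sketch using Celery's worker_process_init signal; the broker URL, make_xmpp_client() helper, and send_message() call are placeholders for whatever XMPP library the app actually uses:

```python
from celery import Celery
from celery.signals import worker_process_init

app = Celery('xmpp_tasks', broker='redis://localhost:6379/0')  # assumed broker URL

xmpp_connection = None  # one long-lived connection per worker process

@worker_process_init.connect
def open_xmpp_connection(**kwargs):
    """Runs once in each worker process after it forks."""
    global xmpp_connection
    xmpp_connection = make_xmpp_client()  # hypothetical helper that logs in to the chat server

@app.task
def notify_users(user_ids, message):
    # Reuse the process-wide connection instead of reconnecting on every task.
    for uid in user_ids:
        xmpp_connection.send_message(uid, message)  # hypothetical client API
```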

Connect new celery periodic task in django

Anonymous (unverified), submitted 2019-12-03 08:44:33
Question: This is not a question but a note for anyone who finds that the declaration of periodic tasks described in the Celery 4.0.1 documentation is hard to integrate into Django: http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#entries. Copy-paste Celery config file main_app/celery.py:

```python
from celery import Celery
from celery.schedules import crontab

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add
```
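For context, the completed pattern from the Celery periodic-tasks documentation looks roughly like the sketch below (a minimal reconstruction; the broker URL is an assumption and the test task is the illustrative one from the docs):

```python
from celery import Celery

app = Celery('main_app', broker='redis://localhost:6379/0')  # assumed broker URL

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Register the schedule once the app is configured:
    # call test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')

@app.task
def test(arg):
    print(arg)
```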

celery: daemonic processes are not allowed to have children

Anonymous (unverified), submitted 2019-12-03 08:42:37
Question: In Python (2.7) I'm trying to create processes (with multiprocessing) inside a Celery task (celery 3.1.17), but it gives the error: daemonic processes are not allowed to have children. Googling it, I found that the most recent versions of billiard fix the "bug", but I have the most recent version (3.3.0.20) and the error still happens. I also tried to implement this workaround in my Celery task, but it gives the same error. Does anybody know how to do it? Any help is appreciated, Patrick. EDIT: snippets of code. Task: from __future__ import absolute
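A workaround that is often suggested for this error (it is not confirmed by the truncated excerpt above) is to spawn the child processes with billiard, Celery's fork of multiprocessing, instead of the standard library module, since billiard relaxes the daemon-process restriction. A minimal sketch, assuming Celery 3.1 with billiard installed alongside it and a placeholder broker URL:

```python
from billiard import Process  # billiard mirrors the multiprocessing API
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')  # assumed broker URL

def crunch(n):
    # placeholder for the real per-process work
    print('crunching', n)

@app.task
def parallel_work(values):
    # Spawn one billiard Process per value inside the (daemonic) worker process.
    procs = [Process(target=crunch, args=(v,)) for v in values]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```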

Celery worker and command line args

Anonymous (unverified), submitted 2019-12-03 08:41:19
Question: I am refactoring my code to use a Celery worker. Previously I used argparse to pass command-line args, e.g.:

```python
if __name__ == "__main__":
    parser = argparse.ArgumentParser(description='Node')
    parser.add_argument('--environment', action="store", default='local',
                        help="env e.g. production or development")
    environment = arg_options.environment
```

But now I get this error:

```
celery -A tasks worker --loglevel=info --environment local
celery: error: no such option: --environment
```

How can I add it? I don't want to use an environment variable if I don't have to. e
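Celery has its own mechanism for extending the command line. A minimal sketch of the "adding new command-line options" approach from the Celery 4.x documentation; the option name and the way it is stored on the app are assumptions based on the question (Celery 5 replaces celery.bin.Option with Click-style options):

```python
from celery import Celery, signals
from celery.bin import Option  # Celery 4.x

app = Celery('tasks', broker='redis://localhost:6379/0')  # assumed broker URL

# Register --environment as a preload option so the celery command accepts it.
app.user_options['preload'].add(
    Option('--environment', default='local',
           help='env e.g. production or development'),
)

@signals.user_preload_options.connect
def handle_preload_options(options, **kwargs):
    # options is a dict of the parsed preload options.
    app.conf.environment = options['environment']
```

Depending on the Celery version, the preload option may need to appear before the worker sub-command (e.g. celery --environment local -A tasks worker).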

python celery - ImportError: No module named _curses - while attempting to run manage.py celeryev

Anonymous (unverified), submitted 2019-12-03 08:36:05
Question: Background: Windows 7 x64, Python 2.7, Django 1.4, Celery with the Redis bundle. While trying to run manage.py celeryev, I get the following error in the terminal:

```
import curses
File 'c:\Python2\lib\curses\__init__.py', line 15, in <module>
    from _curses import *
ImportError: No module named _curses
```

I've tried looking at other posts, but haven't been able to solve this problem. Any thoughts on what is causing this error? Thanks in advance.

Answer 1: According to http://docs.python.org/library/curses.html the curses module is only supported on Unix

The Celery framework

99封情书, submitted 2019-12-03 08:22:36
Celery architecture: Celery consists of three parts: a message broker, task execution units (workers), and a task result store (backend). Installing the celery package only provides the worker by default; the broker and backend (the two storage pieces) have to be supplied by other technologies.

Message broker: Celery does not provide a messaging service itself, but it integrates easily with third-party message brokers such as RabbitMQ and Redis.

Task execution unit: The worker is Celery's unit of task execution; workers run concurrently across the nodes of a distributed system.

Task result store: The task result store holds the results of tasks executed by workers. Celery supports several result backends, including AMQP and Redis.

Use cases: Asynchronous tasks: hand time-consuming operations to Celery to run asynchronously, e.g. sending SMS/email, push notifications, audio/video processing. Scheduled tasks: run something on a schedule, e.g. daily statistics.

Installation and configuration: pip install celery. Message broker: RabbitMQ/Redis.

app = celery.Celery('task_name', broker='xxx', backend='xxx', include=['xxx', 'xxx'])

Running asynchronous tasks with Celery — package layout:

project ├──
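A minimal runnable sketch of the pieces described above, using Redis as both broker and backend (the URLs and task name are placeholders, not from the original notes):

```python
import celery

# The broker carries task messages; the backend stores results (both Redis here).
app = celery.Celery(
    'tasks',
    broker='redis://127.0.0.1:6379/1',
    backend='redis://127.0.0.1:6379/2',
)

@app.task
def add(x, y):
    return x + y

# Submit asynchronously from the caller's side; a worker started with
# `celery -A tasks worker -l info` picks it up and stores the result.
result = add.delay(3, 4)
print(result.id)  # the task id can be used later to fetch the result
```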

How to get all tasks and periodic tasks in Celery [duplicate]

佐手、, submitted 2019-12-03 08:22:22
This question already has answers here: How to find all the subclasses of a class given its name? (9 answers). Possible Duplicate: How can I find all subclasses of a given class in Python? In my Django project, I have some subclasses of Celery's Task and PeriodicTask:

```python
class CustomTask(Task):
    # stuff

class CustomPeriodicTask(PeriodicTask):
    # stuff
```

I need all Task classes to add some custom logging configuration. So I thought I could use __subclasses__, but this does not work:

```
>>> Task.__subclasses__()
[<unbound PeriodicTask>, <class handle_register of <Celery default:0xa1cc3cc>>]
```

Is it somehow
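The duplicate the notice points at suggests walking the subclass tree recursively, since __subclasses__() only returns direct subclasses. A minimal sketch of that approach (the import path matches older Celery/django-celery versions; the logging hook is illustrative):

```python
from celery.task import Task  # import path used by older Celery versions

def all_subclasses(cls):
    """Collect direct and indirect subclasses of cls."""
    direct = cls.__subclasses__()
    return set(direct).union(*(all_subclasses(sub) for sub in direct))

for task_cls in all_subclasses(Task):
    # e.g. attach the custom logging configuration to every task class here
    print('found task class:', task_cls.__name__)
```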

Monitoring Celery, what should I use? [closed]

余生颓废, submitted 2019-12-03 07:50:59
Question: Closed. This question is off-topic and is not accepting answers. Closed 2 years ago. I'm using Django, Celery, and django-celery. I'd like to monitor the state/results of my tasks, but I'm a little confused about how to do that. Do I use ./manage.py celeryev, ./manage.py celerymon, or ./manage.py celerycam? Do I run sudo /etc/init.d/celeryevcam start?

Answer 1: Run: ./manage.py celeryd -E ./manage.py
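Monitoring tools only see what the workers emit, so the worker has to send events; the -E flag in the answer turns that on. Equivalently (a hedged note, assuming the Celery 3.x / django-celery setting names), events can be enabled in the Django settings:

```python
# settings.py -- make workers emit task events so celerycam/celeryev/celerymon
# have something to record (same effect as starting the worker with -E).
CELERY_SEND_EVENTS = True
CELERY_SEND_TASK_SENT_EVENT = True  # optional: also emit "task-sent" events
```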

How to access the orm with celery tasks?

Anonymous (unverified), submitted 2019-12-03 07:50:05
Question: I'm trying to flip a boolean flag for particular types of objects in my database using SQLAlchemy + Celery beat. But how do I access my ORM from the tasks.py file?

```python
from models import Book
from celery.decorators import periodic_task
from application import create_celery_app

celery = create_celery_app()
# Create celery: http://flask.pocoo.org/docs/0.10/patterns/celery/

# This task works fine
@celery.task
def celery_send_email(to, subject, template):
    with current_app.app_context():
        msg = Message(
            subject,
            recipients=[to],
            html=template,
            sender
```
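A minimal sketch of a beat-scheduled task that flips a flag through the ORM, following the same application-context pattern as the working email task and the periodic_task decorator the question already imports; the db handle, the Book.flagged column, and the schedule are assumptions, not taken from the question:

```python
from datetime import timedelta

from celery.decorators import periodic_task
from flask import current_app

from application import create_celery_app, db  # assumes `db` is the Flask-SQLAlchemy handle
from models import Book

celery = create_celery_app()

@periodic_task(run_every=timedelta(hours=24))  # schedule chosen for illustration
def flip_book_flags():
    # Same app-context pattern as the working email task above, so the
    # configured database session is available inside the task.
    with current_app.app_context():
        for book in Book.query.filter_by(flagged=False):  # 'flagged' is a hypothetical column
            book.flagged = True
        db.session.commit()
```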