celery

Multiple Celery projects with the same RabbitMQ broker backend

寵の児 submitted on 2019-11-30 03:28:10
How can I use two different Celery projects that consume messages from a single RabbitMQ installation? Generally, these scripts work fine if I use a separate RabbitMQ instance for each. But on the production machine, I need to share the same RabbitMQ backend between them. Note: due to some constraints, I cannot merge the new project into the existing one, so there will be two separate projects.

Answer (mher): RabbitMQ has the ability to create virtual message brokers called virtual hosts, or vhosts. Each one is essentially a mini-RabbitMQ server with its own queues. This lets you safely use one RabbitMQ server for multiple applications.
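A minimal sketch of that setup, assuming two vhosts created with rabbitmqctl and the default guest credentials (the vhost names and URLs here are illustrative, not from the original question):

    # Create the vhosts once on the RabbitMQ host (shell):
    #   rabbitmqctl add_vhost project_a
    #   rabbitmqctl add_vhost project_b
    #   rabbitmqctl set_permissions -p project_a guest ".*" ".*" ".*"
    #   rabbitmqctl set_permissions -p project_b guest ".*" ".*" ".*"
    from celery import Celery

    # Each project points at its own vhost, so their queues never collide.
    app_a = Celery('project_a', broker='amqp://guest:guest@localhost:5672/project_a')
    app_b = Celery('project_b', broker='amqp://guest:guest@localhost:5672/project_b')

With this split, both projects can run their own workers against the same RabbitMQ server without seeing each other's messages.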

Make Celery use Django's test database without task_always_eager

笑着哭i submitted on 2019-11-30 02:39:26
Question: When running tests in Django applications that make use of Celery tasks, I can't fully test tasks that need to get data from the database, since they don't connect to the test database that Django creates. Setting task_always_eager to True in Celery partially solves this problem, but as the testing documentation says, this doesn't fully reflect how the code will run on a real Celery worker and isn't suitable for testing. How can I make Celery tasks use the Django test database when running tests?
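One common approach (a sketch, not the answer recorded here) is to run an embedded worker inside the test process with celery.contrib.testing.worker.start_worker, so the worker thread shares Django's test database connection. The myproject.celery import path is a hypothetical placeholder:

    from django.test import TransactionTestCase
    from celery.contrib.testing.worker import start_worker

    from myproject.celery import app  # hypothetical path to your Celery app

    class AddTaskTests(TransactionTestCase):
        @classmethod
        def setUpClass(cls):
            super().setUpClass()
            # Start an in-process worker thread that sees the test database
            cls.celery_worker = start_worker(app, perform_ping_check=False)
            cls.celery_worker.__enter__()

        @classmethod
        def tearDownClass(cls):
            cls.celery_worker.__exit__(None, None, None)
            super().tearDownClass()

TransactionTestCase (rather than TestCase) matters here: it commits data so the worker thread can actually see it.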

Celery creating a new connection for each task

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-30 01:44:50
I'm using Celery with Redis to run some background tasks, but each time a task is called, it creates a new connection to Redis. I'm on Heroku and my Redis To Go plan allows for 10 connections. I'm quickly hitting that limit and getting a "max number of clients reached" error. How can I ensure that Celery queues the tasks over a single connection rather than opening a new one each time? EDIT - including the full traceback:

    File "/app/.heroku/venv/lib/python2.7/site-packages/django/core/handlers/base.py", line 111, in get_response
      response = callback(request, *callback_args, **callback_kwargs)
    File …
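A hedged sketch of the usual fix: cap Celery's broker connection pool so tasks reuse one connection instead of opening a new one per call. The setting names below are the Celery 3.x-era Django style, and the Redis URL is illustrative:

    # settings.py
    BROKER_URL = 'redis://localhost:6379/0'  # your Redis To Go URL in production

    # Reuse a single broker connection from the pool instead of one per task
    BROKER_POOL_LIMIT = 1

    # If Redis is also the result backend, cap those connections too
    CELERY_REDIS_MAX_CONNECTIONS = 2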

Python Web: Executing Tasks Asynchronously in Flask

拜拜、爱过 submitted on 2019-11-30 00:56:52
Flask is a well-known lightweight synchronous web framework for Python. During development you may run into tasks that take a long time to process. In that case you need an asynchronous approach: let the long task run in the background and return the response for the current request to the front end first, so the UI doesn't freeze; once the asynchronous task finishes, its status can be reported back if needed. How can this be implemented?

Using threads: when a time-consuming task needs to run, simply start a new thread to execute it. This is the simplest and fastest approach, implemented here with ThreadPoolExecutor:

    from flask import Flask
    from time import sleep
    from concurrent.futures import ThreadPoolExecutor
    # DOCS https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.ThreadPoolExecutor

    # Create a thread pool executor with two worker threads
    executor = ThreadPoolExecutor(2)

    app = Flask(__name__)

    def long_task(arg1, arg2):
        # minimal placeholder body; the original snippet is truncated here
        sleep(5)
        print('task done:', arg1, arg2)

    @app.route('/jobs')
    def run_jobs():
        # hand the time-consuming task off to the thread pool
        executor.submit(long_task, 'hello', 123)
        return 'long task running.'

How to put rate limit on a celery queue?

女生的网名这么多〃 submitted on 2019-11-30 00:30:48
Question: I read this in the Celery documentation on Task.rate_limit (http://celery.readthedocs.org/en/latest/userguide/tasks.html#Task.rate_limit): "Note that this is a per worker instance rate limit, and not a global rate limit. To enforce a global rate limit (e.g. for an API with a maximum number of requests per second), you must restrict to a given queue." How do I put a rate limit on a Celery queue? Thanks for not downvoting the question.

Answer 1: Turns out it can't be done at the queue level for multiple workers. It can …
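For reference, the per-worker limit the quoted docs describe is set on the task itself. A brief sketch; the task name, broker URL, and limit value are illustrative:

    from celery import Celery

    app = Celery('proj', broker='redis://localhost:6379/0')

    # At most 10 executions of this task per minute, per worker instance.
    # With several workers, the effective global rate is 10/m multiplied by
    # the worker count - exactly the caveat the question is about.
    @app.task(rate_limit='10/m')
    def call_external_api(payload):
        ...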

A python-celery tutorial

笑着哭i submitted on 2019-11-29 23:51:06
Celery is a distributed task scheduling module developed in Python, made up of three parts: task dispatch, a job queue, and workers. Celery answers Python's need for running background tasks. The Celery version covered in this article is 3.1.18.

Celery architecture:

    +-------------+     +-----------------------+     +------------------+
    | web service +---->| job queue(redis or ..)+--+->| celery worker.1  |
    +-------------+     +-----------------------+  |  +------------------+
                                                   |  +------------------+
                                                   +->| celery worker.2  |
                                                   |  +------------------+
                                                   |  +------------------+
                                                   +->| celery worker.[n]|
                                                      +------------------+

The job queue supports backends such as Redis, RabbitMQ, and even databases. Redis is usually the best choice, though a database is also fine for local use.

Installing Celery: using Douban's PyPI mirror makes installation a bit faster. pip install -i …
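A minimal Celery 3.1-style app as a taste of what the tutorial builds toward (the module name and broker URL are assumptions):

    # tasks.py
    from celery import Celery

    # Redis as the job queue (broker); a local instance is assumed
    app = Celery('tasks', broker='redis://localhost:6379/0')

    @app.task
    def add(x, y):
        return x + y

Run a worker with celery -A tasks worker --loglevel=info, then call add.delay(2, 3) from another process to put a job on the queue.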

Python Parallel and Distributed Framework: Celery

末鹿安然 submitted on 2019-11-29 23:50:44
Celery (the word means the vegetable "celery") is a distributed task queue developed on Python. It supports scheduling task execution across distributed machines, processes, and threads by way of task queues.

1. Architecture design

Celery's architecture consists of three parts: the message broker, the task execution units (workers), and the task result store.

Message broker: Celery does not provide a message service of its own, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, MongoDB (experimental), Amazon SQS (experimental), CouchDB (experimental), SQLAlchemy (experimental), Django ORM (experimental), and IronMQ.

Task execution units: workers are the task execution units Celery provides; they run concurrently across the nodes of a distributed system.

Task result store: the task result store holds the results of tasks executed by workers. Celery supports storing task results in different ways, including AMQP, Redis, memcached, MongoDB, SQLAlchemy, Django ORM, Apache Cassandra, and IronCache.

In addition, Celery supports different concurrency and serialization mechanisms. Concurrency: Prefork, …
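A hedged sketch of how those three parts appear in configuration, pairing a RabbitMQ broker with a Redis result store (hosts and credentials are illustrative):

    from celery import Celery

    app = Celery(
        'proj',
        broker='amqp://guest:guest@localhost:5672//',  # message broker
        backend='redis://localhost:6379/1',            # task result store
    )

    @app.task
    def square(n):
        return n * n  # the returned value lands in the result store

    # usage: result = square.delay(4); result.get() fetches 16 from the backend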

OpenWorker: A Celery-Based Parallel Processing Project

空扰寡人 submitted on 2019-11-29 23:49:21
OpenWorker: initial setup (2015-05-29). Contributions welcome: https://github.com/supergis/OpenWorker .

OpenWorker is a Python-based parallel processing framework that integrates the Celery, Flower, Jobtastic, and Rodeo projects; tasks can be managed and submitted from the console or over the web.

Celery is a simple and flexible Python parallel processing framework, but the handful of related projects must each be installed and configured separately, which makes things difficult for beginners. OpenWorker bundles these projects together and adds a unified installation script, making deployment, installation, and operation more convenient. OpenWorker merely integrates these resources so that data researchers can use them more easily; it does not replace the original projects. Because these projects are still evolving rapidly, merge and update scripts are also provided to stay in sync with the upstream authors' repositories.

1. Parallel processing framework: Celery, http://www.celeryproject.org/ . Performs task dispatch and scheduling, communicating over a message bus. Introductory Celery tutorials and references: http://my.oschina.net/u/2306127/blog?catalog=2527511

2. Web management console: Flower, https://github.com/mher/flower . Monitors and manages task execution remotely through a web interface.

3. Task progress notification: Jobtastic, http:/

OpenWorker: A Celery-Based Parallel Processing Project - Quick Installation

纵饮孤独 submitted on 2019-11-29 23:49:08
Celery is a simple and flexible Python parallel processing framework, but the handful of related projects must each be installed and configured separately, which makes things difficult for beginners. OpenWorker is a Python-based parallel processing framework that integrates the Celery, Flower, Jobtastic, and Rodeo projects; tasks can be managed and submitted from the console or over the web. OpenWorker bundles these projects together and adds a unified installation script, making deployment, installation, and operation more convenient. OpenWorker merely integrates these resources so that data researchers can use them more easily; it does not replace the original projects. Because these projects are still evolving rapidly, merge and update scripts are also provided to stay in sync with the upstream authors' repositories.

1. Parallel processing framework: Celery, http://www.celeryproject.org/ . Performs task dispatch and scheduling, communicating over a message bus. Introductory Celery tutorials and references: http://my.oschina.net/u/2306127/blog/420833

2. Web management console: Flower, https://github.com/mher/flower . Monitors and manages task execution remotely through a web interface.

3. Task progress notification: Jobtastic, http://policystat.github.io/jobtastic/ . A Celery extension library that provides progress notifications for long-running tasks.

4. Web-based Python console: Rodeo, https:/

Python task queue alternatives and frameworks [closed]

风流意气都作罢 submitted on 2019-11-29 23:31:55
There seem to be various implementations of task/job queues for Python 3: Celery, popular but apparently unmaintained and stale; RQ, of which I have little information; TaskTiger, which, like RQ, I know little about; Huey, which, like RQ, I know little about; and WorQ, which had its last update in 2016. Then there are "cloud"-based solutions like Google's Task Queue API or AWS's CloudWatch Events, but those are more of a last resort. For my project I am looking for a stable and actively maintained task queue implementation. I've used Celery for the past year, but the lack of support and inattention to …