
How can you catch a custom exception from a Celery worker, or stop it being prefixed with `celery.backends.base`?

大憨熊 submitted on 2019-12-03 13:51:43
My Celery task raises a custom exception NonTransientProcessingError, which is then caught by AsyncResult.get().

Tasks.py:

```python
class NonTransientProcessingError(Exception):
    pass

@shared_task()
def throw_exception():
    raise NonTransientProcessingError('Error raised by POC model for test purposes')
```

In the Python console:

```python
from my_app.tasks import *

r = throw_exception.apply_async()
try:
    r.get()
except NonTransientProcessingError as e:
    print('caught NonTrans in type specific except clause')
```

But my custom exception is my_app.tasks.NonTransientProcessingError, whereas the exception raised by
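The usual explanation for this prefix is result serialization: with the default json result serializer the caller cannot reconstruct the original class, so the result backend rebuilds a look-alike exception under celery.backends.base. A hedged workaround, safe only in a trusted environment, is to let results travel as pickle so the real my_app.tasks.NonTransientProcessingError can round-trip:

```python
# Hedged sketch: Celery 3.x-style settings; only use pickle with trusted workers.
CELERY_RESULT_SERIALIZER = 'pickle'
CELERY_ACCEPT_CONTENT = ['pickle', 'json']
```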

Introduction to Using Celery in Django

守給你的承諾、 submitted on 2019-12-03 13:47:27
Introduction to Using Celery in Django

What Celery is: Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages, and it provides the tools required to maintain such a system. It is a task queue focused on real-time processing that also supports task scheduling.

What a task queue is: a task queue is a mechanism for distributing work across threads and machines.

Celery's three components (a minimal wiring sketch appears after the commands below):

- worker (task execution unit): the Worker is Celery's unit of task execution; workers run concurrently on the nodes of a distributed system.
- broker (where tasks are stored): the message transport. Celery does not provide a message service itself, but it integrates easily with third-party message middleware such as RabbitMQ and Redis.
- backend (where results are stored): the task result store keeps the results of tasks executed by Workers; Celery can store results in several ways, including AMQP and Redis.

Use cases:

- Asynchronous tasks: hand time-consuming work to Celery to run asynchronously, e.g. sending SMS or email, push notifications, audio/video processing.
- Scheduled tasks: run something on a schedule, e.g. daily statistics.

Basic commands:

```
# 1. Start the celery service:
# non-Windows:
# command: celery worker -A celery_task (the celery project package) -l info
# Windows: first install the eventlet module, pip install
```
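To make the three components concrete, here is a minimal, hedged sketch of a `celery_task` package wired to Redis; the URLs and the example task are placeholders:

```python
# celery_task/celery.py -- minimal sketch; broker/backend URLs are placeholders.
from celery import Celery

app = Celery(
    'celery_task',
    broker='redis://127.0.0.1:6379/1',   # the broker stores pending tasks
    backend='redis://127.0.0.1:6379/2',  # the backend stores task results
    include=['celery_task.tasks'],
)

# celery_task/tasks.py
#
# from .celery import app
#
# @app.task
# def send_sms(phone):
#     print('sending sms to %s' % phone)
```

Workers are then started with `celery worker -A celery_task -l info`, as described above.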

Task priority in celery with redis

感情迁移 submitted on 2019-12-03 13:44:22
I would like to implement a distributed job execution system with celery. Given that RabbitMQ doesn't support priorities and I badly need this feature, I turned to celery+redis. In my situation, the tasks are closely related to hardware; for example, task A can only run on Worker 1, since only the PC of Worker 1 has the necessary hardware. I set the CONCURRENCY of each worker to 1 so that a worker will only run one task at a time. Each task takes about 2 minutes. To implement the priority feature, first of all I tried adding a priority argument when calling apply_async(), for
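For reference, a hedged sketch of what a Redis-backed setup might look like: hardware-bound tasks are pinned to a per-machine queue, and the Redis transport is told to emulate priorities (the queue name and the step list are assumptions):

```python
from celery import Celery

app = Celery('jobs', broker='redis://localhost:6379/0')
# The Redis transport emulates priorities by splitting each queue into sub-queues.
app.conf.broker_transport_options = {'priority_steps': list(range(10))}

@app.task
def task_a():
    pass  # placeholder for the hardware-bound work

# Pin task A to the machine that has the hardware; on the Redis transport,
# priority 0 is consumed first. Worker 1 runs: celery worker -A jobs -Q worker1 -c 1
result = task_a.apply_async(queue='worker1', priority=0)
```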

Read Celery configuration from Python properties file

夙愿已清 submitted on 2019-12-03 13:26:56
I have an application that needs to initialize Celery and other things (e.g. a database). I would like to have a .ini file containing the application's configuration, passed to the application at runtime.

development.ini:

```ini
[celery]
broker=amqp://localhost/
backend=amqp://localhost/
task.result.expires=3600

[database]
# database config
# ...
```

celeryconfig.py:

```python
from celery import Celery
import ConfigParser

config = ConfigParser.RawConfigParser()
config.read(...)  # Pass this from the command line somehow

celery = Celery('myproject.celery',
                broker=config.get('celery', 'broker'),
```
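One hedged way to get the path in at runtime is an environment variable; the CELERY_CONFIG_FILE name and the Python 3 configparser spelling are assumptions:

```python
# celeryconfig.py -- hedged sketch locating the .ini file via an env var.
import os
import configparser  # Python 3 spelling of ConfigParser

from celery import Celery

config = configparser.RawConfigParser()
config.read(os.environ.get('CELERY_CONFIG_FILE', 'development.ini'))

celery = Celery(
    'myproject.celery',
    broker=config.get('celery', 'broker'),
    backend=config.get('celery', 'backend'),
)
# Map the remaining ini keys onto Celery settings by hand.
celery.conf.CELERY_TASK_RESULT_EXPIRES = config.getint('celery', 'task.result.expires')
```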

Decorator after @task decorator in celery

こ雲淡風輕ζ submitted on 2019-12-03 13:16:57
I'm trying to apply a decorator after the celery @task decorator, something like:

```python
@send_email
@task
def any_function():
    print "inside the function"
```

I can get it to work in the way recommended in the docs, i.e. putting the decorator before the task decorator, but in this case I would like to access the task instance in my decorator. The @send_email would have to be a class decorator; this is what I tried without success:

```python
class send_email(object):
    '''wraps a Task celery class'''
    def __init__(self, obj):
        self.wrapped_obj = obj
        functools.update_wrapper(self, obj)

    def __call__(self, *args, *
```
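For what it's worth, a hedged alternative that does expose the task instance is a custom base class, since Celery passes the bound task to its lifecycle hooks; the mail-sending line here is a placeholder:

```python
from celery import Task, task

class EmailOnResult(Task):
    def on_success(self, retval, task_id, args, kwargs):
        # `self` is the task instance, so its name, request, etc. are available.
        print('would send email for %s (%s)' % (self.name, task_id))

@task(base=EmailOnResult)
def any_function():
    print("inside the function")
```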

How can I view the enqueued tasks in RabbitMQ?

試著忘記壹切 submitted on 2019-12-03 13:07:06
I'm using RabbitMQ as my message broker and my workers are Celery tasks. I'm trying to diagnose an issue where I enqueue tasks to RabbitMQ but Celery doesn't pick them up. Is there a way I can check what tasks are enqueued in RabbitMQ? I'd like to see the date and time when they were enqueued, whether an ETA is specified, the arguments, and the task name. I haven't been able to find this information in the docs — maybe I've overlooked it — and was hoping that some of you might know an easy way to inspect the task queue. Thanks.

You can use Flower to monitor tasks in real time. https://github.com/mher
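Celery's own inspection API covers part of this from Python; note that it only reports messages already delivered to workers, so something like rabbitmqctl list_queues is still needed for messages sitting untouched in RabbitMQ. The broker URL below is a placeholder:

```python
from celery import Celery

app = Celery(broker='amqp://guest@localhost//')  # placeholder URL
i = app.control.inspect()

print(i.active())     # tasks currently executing on each worker
print(i.reserved())   # tasks prefetched by workers but not yet started
print(i.scheduled())  # tasks held back because an ETA/countdown was set
```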

Django Celery ConnectionError: Too many heartbeats missed

六眼飞鱼酱① submitted on 2019-12-03 13:03:56
Question

How can I solve the ConnectionError: Too many heartbeats missed from Celery?

Example Error:

```
[2013-02-11 15:15:38,513: ERROR/MainProcess] Error in timer: ConnectionError('Too many heartbeats missed', None, None, None, '')
Traceback (most recent call last):
  File "/app/.heroku/python/lib/python2.7/site-packages/celery/utils/timer2.py", line 97, in apply_entry
    entry()
  File "/app/.heroku/python/lib/python2.7/site-packages/celery/utils/timer2.py", line 51, in __call__
    return self.fun(*self
```
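A commonly reported workaround, hedged here because the root cause varies with the broker, is to disable AMQP heartbeats entirely so the timer never declares them missed:

```python
# Hedged sketch: Celery 3.x-era Django settings.
BROKER_HEARTBEAT = 0  # 0 disables heartbeats
# The Celery 4+ spelling would be: app.conf.broker_heartbeat = 0
```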

SQLAlchemy session issues with celery

耗尽温柔 submitted on 2019-12-03 12:45:41
Question

I have scheduled a few recurring tasks with celery beat for our web app. The app itself is built using the Pyramid web framework, using the zopetransaction extension to manage sessions. In celery, I am using the app as a library and redefining the session in models with a function. It works well, but once in a while it raises InvalidRequestError: This session is in 'prepared' state; no further SQL can be emitted within this transaction. I am not sure what is wrong and why it issues these warnings.
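A hedged sketch of one common remedy: give each Celery task its own SQLAlchemy session instead of reusing the web app's transaction-managed one, and tear it down with a signal. The engine URL and the task body are illustrative:

```python
from celery import shared_task
from celery.signals import task_postrun
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine('postgresql://localhost/mydb')  # placeholder URL
Session = scoped_session(sessionmaker(bind=engine))

@shared_task
def recurring_job():
    session = Session()
    # ... do ORM work with `session` ...
    session.commit()

@task_postrun.connect
def cleanup_session(*args, **kwargs):
    Session.remove()  # drop the thread-local session after every task
```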

celery task clean-up with DB backend

杀马特。学长 韩版系。学妹 submitted on 2019-12-03 12:44:14
I'm trying to understand how and when tasks are cleaned up in celery. From looking at the task docs I see that:

> Old results will be cleaned automatically, based on the CELERY_TASK_RESULT_EXPIRES setting. By default this is set to expire after 1 day: if you have a very busy cluster you should lower this value.

But this quote is from the RabbitMQ Result Backend section, and I do not see any similar text in the Database Backend section. So my question is: is there a backend-agnostic approach I can take for old-task clean-up with celery, and if not, is there a DB-backend-specific approach I should
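For the database backend specifically, a hedged note: expired rows are purged by the built-in celery.backend_cleanup periodic task, which only runs if celery beat is running. A minimal sketch, with placeholder URLs:

```python
from celery import Celery

app = Celery('proj',
             broker='amqp://localhost//',        # placeholder
             backend='db+sqlite:///results.db')  # placeholder DB backend
app.conf.CELERY_TASK_RESULT_EXPIRES = 3600  # seconds before results expire

# Run `celery -A proj beat` alongside the workers so that the scheduled
# celery.backend_cleanup task actually fires and deletes the expired rows.
```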

Initializing a worker with arguments using Celery

China☆狼群 submitted on 2019-12-03 12:42:55
Question

I'm having trouble finding something that seems like it would be relatively simple. I'm using Celery 3.1 with Python 3 and want to initialize my workers with arguments so that they can use these details for setup. Specifically: these workers will consume tasks that require interacting with a third-party API using authentication credentials. The worker must pass the authentication details to the API server before consuming any tasks (authentication details are
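One hedged way to hand such credentials to workers at start-up is to read them from the environment in a worker_process_init signal handler; the authenticate() helper and the AUTH_* variable names are hypothetical:

```python
import os

from celery.signals import worker_process_init

api_session = None  # per-worker-process API handle

def authenticate(user, token):
    """Hypothetical placeholder for the third-party API log-in call."""
    return (user, token)

@worker_process_init.connect
def init_worker(**kwargs):
    # Runs once in each forked worker process, before any task is consumed.
    global api_session
    api_session = authenticate(os.environ['AUTH_USER'], os.environ['AUTH_TOKEN'])
```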