celery

Celery worker's log contains question marks (???) instead of correct unicode characters

自古美人都是妖i submitted on 2019-12-08 09:10:37
Question: I'm using Celery 3.1.18 with Python 2.7.8 on CentOS 6.5. In a Celery task module, I have the following code:

```python
# someapp/tasks.py
from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@shared_task()
def foo():
    logger.info('Test output: %s', u"测试中")
```

I use the init.d script here to run a Celery worker. I also put the following settings in /etc/default/celeryd:

```shell
CELERYD_NODES="bar"  # %N will be replaced with the first part of the nodename.
```
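The "???" symptom usually comes from the encoding of the stream or file the worker logs to, not from Celery itself. A minimal stdlib sketch of both the symptom and the fix (the logger name and the in-memory stream are stand-ins for illustration, not the asker's setup):

```python
import io
import logging

# Symptom: a log destination that cannot represent CJK characters substitutes
# "?" for each one (this is what errors='replace' does during encoding).
mangled = u"测试中".encode("ascii", errors="replace").decode("ascii")
assert mangled == "???"

# Fix: make sure the handler writes to a UTF-8-capable target, e.g.
# logging.FileHandler(path, encoding="utf-8"); a StringIO stands in here.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger("unicode-demo")   # hypothetical logger name
logger.addHandler(handler)
logger.warning(u"测试中")
assert u"测试中" in stream.getvalue()
```

On a CentOS init script, the same effect is often produced by a locale such as `LANG=C` in the daemon's environment, so checking the worker's locale is worth doing before touching handlers.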

python virtualenv ImportError with celery and billiard

♀尐吖头ヾ submitted on 2019-12-08 08:03:42
Question: I am building a new Amazon instance with Python 2.7.10 as the default. After my machine-provisioning scripts ran and the moment of truth arrived, Celery gave me an ImportError, so I debugged the problem down to billiard. The package appears to be on the correct path, i.e.

```shell
sudo find -name "billiard"
./srv/ia-live/lib64/python2.7/dist-packages/billiard
```

where ia-live is the path of my virtualenv. Checking the path via the virtualenv's Python executable:

```python
import sys
sys.path
['', '/srv/ia-live/bin', '/srv/ia
```
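When a package is on disk but Celery still raises an ImportError, the first thing to rule out is that Celery is running under a different interpreter than the one you tested with. A small diagnostic sketch (Python 3 idiom shown for brevity; the original question is on Python 2.7, where `imp.find_module` plays the same role):

```python
import importlib.util
import sys

# Confirm which interpreter is running, then where "billiard" would be
# imported from on that interpreter's search path.
print("interpreter:", sys.executable)
spec = importlib.util.find_spec("billiard")
print("billiard:", spec.origin if spec else "not importable from this interpreter")
```

Running this with the exact executable the worker uses (e.g. `/srv/ia-live/bin/python`) tells you whether the virtualenv's site-packages is actually on that interpreter's path.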

[Django Celery] Celery blocked doing IO tasks

三世轮回 submitted on 2019-12-08 05:28:45
Question: I use Celery for some IO tasks, such as grabbing remote images and sending email to users. But Celery sometimes blocks with no logs; at that point it won't execute any task I send, and I have to restart it, after which it resumes from where it blocked. This has puzzled me for a very long time. What can I do? And what is the best practice for distributing IO tasks with Celery? Answer 1: By default, the Celery worker forks several processes that wait for task requests from clients. For IO-pending tasks, where your system need a
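The answer above is cut off, but the standard remedy it points toward is to stop using the default prefork pool for IO-bound work and run the worker with a green-thread pool, which keeps many IO-waiting tasks in flight per process. A sketch of the invocation ("proj" is a placeholder app module, and the eventlet package must be installed separately):

```shell
# Run the worker with an eventlet pool sized for IO concurrency
# (requires `pip install eventlet`; gevent via -P gevent works similarly).
celery -A proj worker -P eventlet -c 100
```

Setting sensible task time limits (`--time-limit`) also prevents a single hung IO call from silently stalling a worker process.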

Asynchronous task queue processing of in-memory data structure in Django

*爱你&永不变心* submitted on 2019-12-08 05:00:43
Question: I have a singleton in-memory data structure inside my Django project (a kind of kd-tree that needs to be accessed all across the project). For those who don't know Django: I believe the same issue would appear in regular Python code. I know it's evil (a Singleton), and I'm looking for better ways to implement it, but my question here concerns another topic: I am instantiating the singleton in my code by calling Singleton.instance() and it gives me the object correctly, it
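For context, a minimal thread-safe lazy singleton of the kind the question describes can be sketched as follows (the payload dict is a hypothetical stand-in for the kd-tree):

```python
import threading

class Singleton(object):
    """Minimal thread-safe lazy singleton (a sketch, not the asker's code)."""
    _instance = None
    _lock = threading.Lock()

    def __init__(self):
        # stand-in for the in-memory kd-tree
        self.tree = {"payload": "stand-in"}

    @classmethod
    def instance(cls):
        # double-checked locking: fast path without the lock, safe creation with it
        if cls._instance is None:
            with cls._lock:
                if cls._instance is None:
                    cls._instance = cls()
        return cls._instance

a = Singleton.instance()
b = Singleton.instance()
assert a is b  # same object within one process
```

The catch with task queues is that this identity only holds within one process: Celery workers are separate OS processes, so each worker builds its own copy of the structure and mutations in one are invisible to the others.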

How to make Celery worker return results from task

此生再无相见时 submitted on 2019-12-08 04:39:18
Question: I have a Flask app which calls a task. The task pulls data from a database, plots line charts, and returns HTML content which is rendered on an HTML page. Without Celery the Flask app works fine and renders the line chart on the client side, but now I want to delegate the task to Celery via a RabbitMQ broker. The task does run, as I can see the log output in the Celery shell, but the resulting HTML content never gets sent back to the Flask server app. How do I do that? This is a follow-up to http://stackoverflow.com

Tornado IOLoop Exception in callback None in Celery worker

徘徊边缘 submitted on 2019-12-08 04:13:47
Question: I am using tornado.ioloop inside a Celery worker because I need to use MongoDB.

```python
class WorkerBase():
    @gen.engine
    def foo(self, args, callback):
        bar = ['Python', 'Celery', 'Javascript', 'HTML']
        # ... process something ...
        callback(bar)

    @gen.engine
    def RunMyTask(self, args):
        result = yield gen.Task(self.foo, args=args)
        # Stop IOLoop instance
        IOLoop.instance().stop()

@task(name="MyWorker", base=WorkerBase)
def CeleryWorker(args):
    # This works because I'm adding base as WorkerBase
    CeleryWorker.RunMyTask
```

Celery task function custom attributes

巧了我就是萌 submitted on 2019-12-08 02:43:53
Question: I have a Celery task function that looks like this:

```python
@task(base=MyBaseTask)
@my_custom_decorator
def my_task(*args, **kwargs):
    my_task.ltc.some_func()  # fails - attribute ltc doesn't exist on the object
```

and my_custom_decorator looks like this:

```python
def my_custom_decorator(f):
    from functools import wraps
    ltc = SomeClass()

    @wraps(f)
    def _inner(*args, **kwargs):
        ret_obj = None
        try:
            f.task_cache = ltc
            ret_obj = f(*args, **kwargs)
        except Exception, e:
            raise
        return ret_obj

    _inner.ltc = ltc
    return _inner

I
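The likely cause is that `@task` wraps the decorated function in a Task object, and that wrapping object does not proxy arbitrary attributes set on the inner function. This can be shown without Celery at all, using a plain stand-in class for the Task wrapper (everything below is illustrative, not Celery's actual implementation):

```python
from functools import wraps

class SomeClass(object):
    def some_func(self):
        return "ok"

def my_custom_decorator(f):
    ltc = SomeClass()

    @wraps(f)
    def _inner(*args, **kwargs):
        f.task_cache = ltc
        return f(*args, **kwargs)

    _inner.ltc = ltc  # attribute lives on the wrapped function
    return _inner

# Hypothetical stand-in for Celery's Task wrapper: it keeps the callable
# but does not forward arbitrary attribute lookups to it.
class TaskStandIn(object):
    def __init__(self, fn):
        self.run = fn

@my_custom_decorator
def my_task():
    return "ran"

wrapped = TaskStandIn(my_task)
assert wrapped.run.ltc.some_func() == "ok"  # survives on the function...
assert not hasattr(wrapped, "ltc")          # ...but not on the wrapping object
```

So reaching the attribute through the underlying function (rather than through the Task object) is one way around the AttributeError the question describes.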

Python Celery - lookup task by pid

不打扰是莪最后的温柔 submitted on 2019-12-08 02:23:36
Question: A pretty straightforward question, maybe: I often see a Celery task process running on my system that I cannot find when I use the active() method of celery.task.control.inspect(). Often this process will run for hours, and I worry that it's a zombie of some sort; usually it's using up a lot of memory, too. Is there a way to look up a task by Linux PID? Does Celery or the AMQP result backend save that? If not, is there any other way to figure out which particular task is the one that's sitting
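For what it's worth, the payload returned by inspect().active() is a mapping of worker name to a list of task dicts, and in Celery 3.x those dicts carry a worker_pid field, so matching by PID is possible when the worker does respond. A sketch of a helper operating on that response shape (the helper name and the sample payload are made up for illustration):

```python
def find_task_by_pid(active, pid):
    """Scan an inspect().active()-style response for a task with this PID."""
    for worker, tasks in (active or {}).items():
        for t in tasks:
            if t.get("worker_pid") == pid:
                return worker, t
    return None

# sample payload mimicking a broker response (fabricated for illustration)
sample = {"celery@host": [{"id": "abc", "name": "someapp.tasks.foo",
                           "worker_pid": 1234}]}
assert find_task_by_pid(sample, 1234)[1]["id"] == "abc"
assert find_task_by_pid(sample, 999) is None
```

A process that inspect() cannot see at all, though, typically belongs to a worker that is no longer answering control messages, which no lookup on the response can recover.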

mysql command out of sync when executing insert from celery

≡放荡痞女 submitted on 2019-12-07 19:17:07
Question: I am running into the dreaded "MySQL commands out of sync" error when using a custom DB library and Celery. The library is as follows:

```python
import pymysql
import pymysql.cursors
from furl import furl
from flask import current_app

class LegacyDB:
    """Db Legacy Database connectivity library"""

    def __init__(self, app):
        with app.app_context():
            self.rc = current_app.config['RAVEN']
            self.logger = current_app.logger
            self.data = {}
            # setup MySQL
            try:
                uri = furl(current_app.config['DBCX'])
                self.dbcx = pymysql
```
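"Commands out of sync" typically means one MySQL connection is being shared by multiple Celery worker processes or threads, each interleaving its commands on the same socket. The usual fix is to open a connection per process (or per task) instead of at import time. A generic sketch of the pattern, with a stand-in factory so it runs anywhere (`connect` is hypothetical; in real code it would be `pymysql.connect(...)`):

```python
import os

_conn_cache = {}

def connect():
    """Stand-in for pymysql.connect(...); returns a fake connection object."""
    return {"pid": os.getpid()}

def get_connection():
    """Return a connection owned by the current process, creating it lazily.

    Forked Celery workers each have their own PID, so each gets its own
    connection instead of inheriting and sharing the parent's socket.
    """
    pid = os.getpid()
    if pid not in _conn_cache:
        _conn_cache[pid] = connect()
    return _conn_cache[pid]

c1 = get_connection()
c2 = get_connection()
assert c1 is c2                 # reused within one process
assert c1["pid"] == os.getpid()
```

With Celery specifically, the same effect can be achieved by creating the connection in a worker_process_init signal handler rather than in the library's module-level `__init__`.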

Celery

时光怂恿深爱的人放手 submitted on 2019-12-07 17:28:33
Official Celery resources
Celery website: http://www.celeryproject.org/
Official documentation (English): http://docs.celeryproject.org/en/latest/index.html
Official documentation (Chinese): http://docs.jinkan.org/docs/celery/

Celery architecture
Celery's architecture consists of three parts: the message broker, the task execution units (workers), and the task result store.

Message broker: Celery does not provide a message service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.
Task execution unit: the worker is the unit of task execution provided by Celery; workers run concurrently on the nodes of a distributed system.
Task result store: the task result store saves the results of tasks executed by workers. Celery supports storing results in different backends, including AMQP, Redis, and others.

Use cases
Asynchronous tasks: submit time-consuming operations to Celery for asynchronous execution, such as sending SMS/email, push notifications, and audio/video processing.
Scheduled tasks: run something on a schedule, such as daily data statistics.

Installing and configuring Celery
pip install celery
Message broker: RabbitMQ/Redis
app=Celery(