
Celery

夙愿已清 submitted on 2019-11-30 10:32:14
1. What is Celery
Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages. It is focused on real-time processing of asynchronous task queues, and it also supports task scheduling.

2. Celery architecture
Celery consists of three parts: a message broker, task execution units (workers), and a task result store.

2.1 Message broker
Celery does not provide a messaging service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.

2.2 Task execution unit (worker)
The worker is Celery's task execution unit; workers run concurrently across the nodes of a distributed system.

2.3 Task result store
The task result store holds the results of tasks executed by workers. Celery supports several result backends, including Redis.

3. Use cases
Asynchronous tasks: hand time-consuming operations to Celery for asynchronous execution, such as sending SMS or email, pushing notifications, and processing audio or video.
Scheduled tasks: run something on a schedule, such as daily statistics.

4. Running asynchronous tasks with Celery

4.1 Installation and configuration
# install the celery package
pip install celery
# message broker: RabbitMQ / Redis

4.2 Basic usage
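Below is a minimal sketch of the basic usage this section is leading up to, assuming a local Redis instance as both broker and result backend; the module name tasks.py and the add task are illustrative, not part of the original text.

# tasks.py -- a minimal Celery app; the URLs assume Redis on localhost.
from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost:6379/0',   # message broker
    backend='redis://localhost:6379/1',  # task result store
)

@app.task
def add(x, y):
    return x + y

Start a worker with celery -A tasks worker --loglevel=info, then submit work from another process:

# caller -- enqueue the task and block for its result.
from tasks import add

async_result = add.delay(4, 6)        # returns an AsyncResult immediately
print(async_result.get(timeout=10))   # prints 10 once a worker runs it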

Python Celery - How to call celery tasks inside other task

混江龙づ霸主 submitted on 2019-11-30 10:08:13
I'm calling a task within a task in Django-Celery. Here are my tasks:

@shared_task
def post_notification(data, url):
    url = "http://posttestserver.com/data/?dir=praful"  # when in production, remove this line.
    headers = {'content-type': 'application/json'}
    requests.post(url, data=json.dumps(data), headers=headers)

@shared_task
def shipment_server(data, notification_type):
    notification_obj = Notification.objects.get(name=notification_type)
    server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)
    for server in server_list:
        task = post_notification.delay(data
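The excerpt is cut off mid-call, so the exact arguments are unknown. As a hedged sketch of the pattern being asked about: calling .delay() inside another task simply enqueues a new message, which any free worker then picks up. The dispatch_all task and the url list below are illustrative, not from the question.

# sketch: one shared task dispatching another from inside its body.
import json

import requests
from celery import shared_task

@shared_task
def post_notification(data, url):
    headers = {'content-type': 'application/json'}
    requests.post(url, data=json.dumps(data), headers=headers)

@shared_task
def dispatch_all(data, urls):
    # .delay() only enqueues; it returns immediately with an AsyncResult,
    # so looping here does not block the outer task on the HTTP calls.
    for url in urls:
        post_notification.delay(data, url)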

django celery: how to set task to run at specific interval programmatically

孤人 submitted on 2019-11-30 09:59:01
I found that I can set a task to run at a specific interval at specific times from here, but that was only done at task declaration. How do I set a task to run periodically, dynamically? The schedule is derived from a setting, and thus seems to be immutable at runtime.

You can probably accomplish what you're looking for using Task ETAs. This guarantees that your task won't run before the desired time, but doesn't promise to run the task at exactly the designated time; if the workers are overloaded at the designated ETA, the task may run later. If that restriction isn't an issue, you could write a
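A short sketch of the Task ETA approach the answer mentions; the add task, delays, and times are illustrative.

# schedule a task for (no earlier than) a specific time with apply_async().
from datetime import datetime, timedelta, timezone

from tasks import add  # any registered task

# Relative delay: run no earlier than 5 minutes from now.
add.apply_async(args=(2, 3), countdown=300)

# Absolute ETA: run no earlier than one hour from now.
add.apply_async(args=(2, 3), eta=datetime.now(timezone.utc) + timedelta(hours=1))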

How to list the queued items in celery?

放肆的年华 submitted on 2019-11-30 09:02:41
Question

I have a Django project on an Ubuntu EC2 node, which I have been using to set up an asynchronous task queue using Celery. I am following http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/ along with the docs. I've been able to get a basic task working at the command line, using:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery --app=myproject.celery:app worker --loglevel=INFO

I just realized that I have a bunch of tasks in my queue, that
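The excerpt ends before any answer, but with Redis as the broker the pending messages for a queue sit in a Redis list named after the queue ('celery' by default), so they can be counted and inspected directly. A hedged sketch, assuming Redis on localhost:6379, database 0:

# peek at messages waiting in the default 'celery' queue.
import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)

print('tasks waiting:', r.llen('celery'))

# Each list entry is a serialized message whose exact layout varies by
# Celery version and serializer, so just print the raw payloads here.
for raw in r.lrange('celery', 0, 4):
    print(raw[:200])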

Django Celery implementation - OSError : [Errno 38] Function not implemented

孤街浪徒 submitted on 2019-11-30 08:59:51
I installed django-celery and I tried to start up the worker server, but I get an OSError saying a function isn't implemented. I'm running CentOS release 5.4 (Final) on a VPS:

. broker -> amqp://guest@localhost:5672/
. queues ->
.   celery -> exchange:celery (direct) binding:celery
. concurrency -> 4
. loader -> djcelery.loaders.DjangoLoader
. logfile -> [stderr]@WARNING
. events -> OFF
. beat -> OFF

[2010-07-22 17:10:01,364: WARNING/MainProcess] Traceback (most recent call last):
[2010-07-22 17:10:01,364: WARNING/MainProcess]   File "manage.py", line 11, in <module>
[2010-07-22 17:10:01,364: WARNING
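The traceback is truncated, so the root cause is not confirmed here; a commonly cited cause of Errno 38 (ENOSYS) on VPSes of that era is that multiprocessing's sem_open() needs /dev/shm, which some container kernels lack. A hedged workaround sketch under that assumption, using the legacy django-celery setting names:

# settings.py -- switch the worker to a pool that avoids multiprocessing.
# Assumption: the OSError comes from sem_open(); 'solo' runs tasks inline
# in the worker process, trading concurrency for compatibility.
CELERYD_POOL = 'solo'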

Django Celery get task count

余生颓废 submitted on 2019-11-30 08:45:14
I am currently using Django with Celery and everything works fine. However, I want to give users the opportunity to cancel a task if the server is overloaded, by checking how many tasks are currently scheduled. How can I achieve this? I am using Redis as the broker.

I just found this: Retrieve list of tasks in a queue in Celery. It is somewhat related to my issue, but I don't need to list the tasks, just count them :)

If your broker is configured as redis://localhost:6379/1, and your tasks are submitted to the general celery queue, then you can get the length by the following means:
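The answer is cut off right before its code; given the described setup it was presumably a Redis LLEN call. A hedged reconstruction:

# count waiting tasks when the broker is redis://localhost:6379/1.
import redis

queue_name = 'celery'  # the general default queue mentioned above
client = redis.StrictRedis(host='localhost', port=6379, db=1)
print('pending tasks:', client.llen(queue_name))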

Celery does not release memory

痞子三分冷 submitted on 2019-11-30 08:43:28
It looks like Celery does not release memory after a task finishes. Every time a task finishes, it leaks 5 MB to 10 MB of memory, so with thousands of tasks it will soon use up all available memory.

BROKER_URL = 'amqp://user@localhost:5672/vhost'
# CELERY_RESULT_BACKEND = 'amqp://user@localhost:5672/vhost'
CELERY_IMPORTS = ('tasks.tasks', )
CELERY_IGNORE_RESULT = True
CELERY_DISABLE_RATE_LIMITS = True
# CELERY_ACKS_LATE = True
CELERY_TASK_RESULT_EXPIRES = 3600
# maximum time for a task to execute
CELERYD_TASK_TIME_LIMIT = 600
CELERY_DEFAULT_ROUTING_KEY = "default"
CELERY_DEFAULT_QUEUE = 'default'
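The excerpt ends before any answer. One commonly used mitigation for per-task memory growth, offered here as an assumption rather than the thread's accepted fix, is to recycle each worker child after a fixed number of tasks so the OS reclaims its memory:

# settings addition -- restart each pool worker after 100 tasks.
# CELERYD_MAX_TASKS_PER_CHILD matches the legacy setting names used above;
# newer Celery spells it worker_max_tasks_per_child.
CELERYD_MAX_TASKS_PER_CHILD = 100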

Retrying celery failed tasks that are part of a chain

走远了吗. submitted on 2019-11-30 08:08:44
Question

I have a Celery chain that runs some tasks. Each of the tasks can fail and be retried. Please see below for a quick example:

from celery import task

@task(ignore_result=True)
def add(x, y, fail=True):
    try:
        if fail:
            raise Exception('Ugly exception.')
        print '%d + %d = %d' % (x, y, x + y)
    except Exception as e:
        raise add.retry(args=(x, y, False), exc=e, countdown=10)

@task(ignore_result=True)
def mul(x, y):
    print '%d * %d = %d' % (x, y, x * y)

and the chain:

from celery.canvas import chain

chain(add
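The excerpt stops at the chain construction. A hedged guess at how the two tasks above might be wired together, using immutable signatures (.si) since the tasks ignore results and take explicit arguments; the argument values are illustrative:

# wiring the question's tasks into a chain.
from celery import chain

from tasks import add, mul  # hypothetical module holding the tasks above

workflow = chain(add.si(1, 2), mul.si(3, 4))
workflow.apply_async()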

Celery Task Chain and Accessing **kwargs

十年热恋 submitted on 2019-11-30 07:06:32
I have a situation similar to the one outlined here, except that instead of chaining tasks with multiple arguments, I want to chain tasks that return a dictionary with multiple entries. This is, very loosely and abstractly, what I'm trying to do:

tasks.py

@task()
def task1(item1=None, item2=None):
    item3 = # do some stuff with item1 and item2 to yield item3
    return_object = dict(item1=item1, item2=item2, item3=item3)
    return return_object

def task2(item1=None, item2=None, item3=None):
    item4 = # do something with item1, item2, item3 to yield item4
    return_object = dict(item1=item1, item2=item2,
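A hedged sketch of one way to do this: in a chain, each task receives the previous task's return value as its first positional argument, so the second task can accept the whole dict and unpack it itself. The task names mirror the question; the bodies are placeholders, and the Redis URLs are assumptions.

# chaining tasks that pass a dict forward.
from celery import Celery, chain

app = Celery('sketch',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def task1(item1=None, item2=None):
    item3 = (item1 or 0) + (item2 or 0)   # placeholder computation
    return dict(item1=item1, item2=item2, item3=item3)

@app.task
def task2(previous):
    # 'previous' is task1's returned dict; unpack it into kwargs here.
    item4 = previous['item3'] * 2         # placeholder computation
    return dict(item4=item4, **previous)

result = chain(task1.s(item1=1, item2=2), task2.s()).apply_async()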

Retrieve a task result object, given a `task_id` in Celery

北城以北 submitted on 2019-11-30 07:02:55
Question

I store the task_id from a celery.result.AsyncResult in a database and relate it to the item that the task affects. This allows me to perform a query to retrieve all the task_ids of tasks that relate to a specific item. So after retrieving the task_id from the database, how do I go about retrieving information about the task's state/result/etc.?

Answer 1:

From the Celery FAQ:

result = MyTask.AsyncResult(task_id)
result.get()

Source: https://stackoverflow.com/questions/5544611/retrieve-a-task-result
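A companion sketch: when the task class is not importable where the lookup happens, celery.result.AsyncResult can be rebuilt directly from the stored task_id. The app module and the stored id below are illustrative.

# rebuild an AsyncResult from a task_id kept in the database.
from celery.result import AsyncResult

from myproject.celery import app  # hypothetical Celery app module

task_id = 'stored-task-id-from-the-database'
result = AsyncResult(task_id, app=app)

print(result.state)           # e.g. PENDING, STARTED, SUCCESS, FAILURE
if result.ready():
    print(result.result)      # return value, or the exception on failure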