celery

Celery: stop execution of a chain

Anonymous (unverified) submitted on 2019-12-03 02:14:01
Question: I have a check_orders task that's executed periodically. It makes a group of tasks so that I can time how long executing the tasks took, and perform something when they're all done (this is the purpose of res.join [1] and grouped_subs). The tasks that are grouped are pairs of chained tasks. What I want is: when the first task doesn't meet a condition (fails), don't execute the second task in the chain. I can't figure this out for the life of me, and I feel this is pretty basic functionality for a job queue manager. When I try the things I
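By default, a task that raises an exception stops the rest of its chain; if the first task should instead skip the remainder without being marked as failed, a bound task can clear its own chain. A minimal sketch of both ideas, assuming a hypothetical validate() helper in place of the real condition (self.request.chain is the Celery 4+ spelling; older releases managed this through the request's callbacks):

    from celery import Celery, chain

    app = Celery('orders', broker='amqp://guest@localhost//')

    @app.task(bind=True)
    def check_order(self, order_id):
        if not validate(order_id):  # validate() is a hypothetical stand-in
            # Clearing the request's chain prevents the next task from being
            # queued, without marking this task as failed (raising an
            # exception here would also stop the chain, but as a failure).
            self.request.chain = None
        return order_id

    @app.task
    def process_order(order_id):
        ...  # only runs if check_order left the chain intact

    chain(check_order.s(42), process_order.s()).apply_async()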

Celery and transaction.atomic

送分小仙女 submitted on 2019-12-03 02:08:07
In some Django views, I used a pattern like this to save changes to a model, and then to do some asynchronous updating (such as generating images, further altering the model) based on the new model data. mytask is a celery task:

    with transaction.atomic():
        mymodel.save()
        mytask.delay(mymodel.id).get()

The problem is that the task never returns. Looking at celery's logs, the task gets queued (I see "Received task" in the log), but it never completes. If I move the mytask.delay...get call out of the transaction, it completes successfully. Is there some incompatibility between transaction.atomic
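The usual explanation is a deadlock: .get() blocks the view inside the still-open transaction, while the worker cannot see the uncommitted row it was asked to process. One common fix, sketched here with the question's names, is to queue the task only after the transaction commits, via Django's transaction.on_commit (available since Django 1.9):

    from django.db import transaction

    with transaction.atomic():
        mymodel.save()
        # Runs only after the surrounding transaction commits, so the worker
        # can actually read the new row; never block on .get() inside
        # atomic().
        transaction.on_commit(lambda: mytask.delay(mymodel.id))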

Python-based asynchronous workflow modules: what is the difference between Celery workflows and Luigi workflows?

独自空忆成欢 submitted on 2019-12-03 02:06:56
I am using Django as a web framework. I need a workflow engine that can run synchronous as well as asynchronous (batch) chains of tasks. I found Celery and Luigi as batch-processing workflow engines. My first question is: what is the difference between these two modules? Luigi allows us to rerun a failed chain of tasks, and only the failed sub-tasks get re-executed. What about Celery: if we rerun the chain (after fixing the failed sub-task's code), will it rerun the already-succeeded sub-tasks? Suppose I have two sub-tasks. The first one creates some files and the second one reads those files. When I put these into
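For context on the Celery side: a chain simply feeds each task's return value to the next, and re-running the chain re-executes every step from scratch; unlike Luigi, Celery has no built-in notion of output targets that lets it skip already-completed sub-tasks. A minimal sketch of the two-sub-task setup described above:

    from celery import Celery, chain

    app = Celery('wf', broker='amqp://guest@localhost//')

    @app.task
    def create_files(job_id):
        # ... write the files for job_id ...
        return job_id

    @app.task
    def read_files(job_id):
        # ... read the files written by create_files ...
        return job_id

    # Every .apply_async() runs both steps again; Celery does not skip
    # create_files just because an earlier run of it succeeded.
    chain(create_files.s(1), read_files.s()).apply_async()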

Python Celery - How to call celery tasks inside another task

Anonymous (unverified) submitted on 2019-12-03 02:06:01
Question: I'm calling a task within a task in Django-Celery. Here are my tasks:

    @shared_task
    def post_notification(data, url):
        url = "http://posttestserver.com/data/?dir=praful"  # when in production, remove this line
        headers = {'content-type': 'application/json'}
        requests.post(url, data=json.dumps(data), headers=headers)

    @shared_task
    def shipment_server(data, notification_type):
        notification_obj = Notification.objects.get(name=notification_type)
        server_list = ServerNotificationMapping.objects.filter(notification_name=notification_obj)
        for server in
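A task can invoke another task either as a plain function call (which runs it synchronously inside the same worker) or by queueing it with .delay()/.apply_async(). A minimal sketch of the asynchronous form, with hypothetical names standing in for the question's tasks:

    from celery import shared_task

    @shared_task
    def notify(server_url, data):
        ...  # e.g. the POST performed by post_notification above

    @shared_task
    def fan_out(data, server_urls):
        for url in server_urls:
            # Queues a separate notify task per server; calling
            # notify(url, data) directly would instead run it inline
            # inside this worker process.
            notify.delay(url, data)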

Celery - Get task id for current task

Anonymous (unverified) submitted on 2019-12-03 02:05:01
Question: How can I get the task_id value for a task from within the task? Here's my code:

    from celery.decorators import task
    from django.core.cache import cache

    @task
    def do_job(path):
        "Performs an operation on a file"
        # ... Code to perform the operation ...
        cache.set(current_task_id, operation_results)

The idea is that when I create a new instance of the task, I retrieve the task_id from the task object. I then use the task id to determine whether the task has completed. I don't want to keep track of the task by the path value because the file is
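The standard answer is to bind the task so it receives its own instance and can read the request context. A minimal sketch using the modern @shared_task decorator (celery.decorators is long deprecated); operation_results stands in for the question's real work:

    from celery import shared_task
    from django.core.cache import cache

    @shared_task(bind=True)
    def do_job(self, path):
        "Performs an operation on a file"
        operation_results = ...  # the real work goes here
        # With bind=True the running task instance is passed in as `self`,
        # and its id is available on the request context.
        cache.set(self.request.id, operation_results)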

“OSError: dlopen(libSystem.dylib, 6): image not found” (OS X + macports + Celery 3.1.7)

Anonymous (unverified) submitted on 2019-12-03 02:03:01
Question: I just updated celery via pip (1.5) to the latest version (3.1.7), but I get a fatal exception, which I don't understand, as soon as I try to import the library. Running from celery import Celery in the shell gives me:

    File "<stdin>", line 1, in <module>
    File "/Users/davidezanotti/CygoraPythonEnv/lib/python2.7/site-packages/celery/__init__.py", line 130, in <module>
      from .five import recreate_module
    File "/Users/davidezanotti/CygoraPythonEnv/lib/python2.7/site-packages/celery/five.py", line 51, in <module>
      from kombu.five import monotonic
    File "/Users/davidezanotti

Celery: is there a way to write a custom JSON Encoder/Decoder?

Anonymous (unverified) submitted on 2019-12-03 02:00:02
Question: I have some objects I want to send to celery tasks in my application. Those objects are obviously not JSON serializable using the default json library. Is there a way to make celery serialize/deserialize those objects with a custom JSON Encoder/Decoder?

Answer 1: A bit late here, but you should be able to define a custom encoder and decoder by registering them in the kombu serializer registry, as in the docs: http://docs.celeryproject.org/en/latest/userguide/calling.html#serializers . For example, the following is a custom datetime serializer
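The registration the answer points at looks roughly like this; a sketch that round-trips datetime objects (names like my_dumps are illustrative, and datetime.fromisoformat assumes Python 3.7+):

    import json
    from datetime import datetime

    from kombu.serialization import register

    class DateTimeEncoder(json.JSONEncoder):
        def default(self, obj):
            if isinstance(obj, datetime):
                return {'__datetime__': obj.isoformat()}
            return super().default(obj)

    def decode_hook(d):
        if '__datetime__' in d:
            return datetime.fromisoformat(d['__datetime__'])
        return d

    def my_dumps(obj):
        return json.dumps(obj, cls=DateTimeEncoder)

    def my_loads(s):
        return json.loads(s, object_hook=decode_hook)

    # Register the serializer; then select it via Celery's task_serializer
    # and accept_content settings so it is used for task messages.
    register('myjson', my_dumps, my_loads,
             content_type='application/x-myjson',
             content_encoding='utf-8')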

Celery with RabbitMQ: AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

Anonymous (unverified) submitted on 2019-12-03 01:57:01
Question: I'm running the First Steps with Celery tutorial. We define the following task:

    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def add(x, y):
        return x + y

Then call it:

    >>> from tasks import add
    >>> add.delay(4, 4)

But I get the following error: AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'. I'm running both the celery worker and the rabbitmq server. Rather strangely, the celery worker reports the task as succeeding:

    [2014-04-22 19:12:03,608: INFO/MainProcess] Task
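The usual cause is that no result backend is configured, so inspecting the result (e.g. .get() or .status on the AsyncResult) has nowhere to read from, even though the worker ran the task fine. A minimal sketch of one fix, using the RPC backend over the same RabbitMQ broker:

    from celery import Celery

    app = Celery('tasks',
                 broker='amqp://guest@localhost//',
                 backend='rpc://')  # stores results so .get()/.status work

    @app.task
    def add(x, y):
        return x + y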

Celery "Received unregistered task of type" (running the example)

Anonymous (unverified) submitted on 2019-12-03 01:47:02
Question: I'm trying to run the example from the Celery documentation. I run:

    $ celeryd --loglevel=INFO
    /usr/local/lib/python2.7/dist-packages/celery/loaders/default.py:64: NotConfigured: No 'celeryconfig' module found! Please make sure it exists and is available to Python.
      "is available to Python." % (configname, )))
    [2012-03-19 04:26:34,899: WARNING/MainProcess]
     -------------- celery@ubuntu v2.5.1
    ---- **** -----
    --- * ***  * -- [Configuration]
    -- * - ****  ---   . broker:  amqp://guest@localhost:5672//
    - ** ----------   . loader:  celery.loaders.default.Loader
    - **
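The NotConfigured warning is a strong hint: without a celeryconfig module on the Python path, the worker never imports the module that defines the task, which is exactly what produces "Received unregistered task". A sketch of a minimal celeryconfig.py using that era's uppercase setting names (pre-4.0 style), assuming the tasks live in tasks.py next to it:

    # celeryconfig.py, placed alongside tasks.py
    BROKER_URL = 'amqp://guest@localhost:5672//'
    # Make the worker import the module that defines the tasks, so they
    # are registered before messages for them arrive.
    CELERY_IMPORTS = ('tasks',)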

Django and Celery, AppRegistryNotReady exception

Anonymous (unverified) submitted on 2019-12-03 01:45:01
Question: I'm trying to integrate Celery into my Django project. I've followed the Celery docs, and I can execute a simple Hello World task. But when I try to import my models into my task definitions, I get the AppRegistryNotReady exception. I'm finding some older discussions around this exception, but nothing current. I'm probably missing something quite simple. Python 3.5, Django 1.9, Celery 3.1.23. Celery.py:

    from __future__ import absolute_import
    import os

    from celery import Celery
    from django.conf import settings

    os.environ
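With Django 1.9 and Celery 3.1, the usual fix is to make sure the settings module is set, and the app registry populated, before anything imports models; importing models at the top of celery.py or a tasks module is what typically raises AppRegistryNotReady. A sketch of the conventional celery.py from the Celery 3.1 Django docs, under an assumed project name of myproject:

    from __future__ import absolute_import
    import os

    from celery import Celery
    from django.conf import settings

    # Must happen before the app is created or any model is imported.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

    app = Celery('myproject')
    app.config_from_object('django.conf:settings')
    # Celery 3.1-style autodiscovery; the lambda defers reading
    # INSTALLED_APPS until Django's app registry is ready. Import models
    # inside task bodies (or after django.setup()), not at module level.
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)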