Celery

Failure to start celeryd - Error: conflicting option string(s): --no-color

Submitted by 别说谁变了你拦得住时间么 on 2019-12-10 22:03:58
Question: I'm using django v1.7.0b4 and celery v3.1.1, and I followed the steps in the Django installation guide, but I'm stuck with the error below.

$ ./manage.py celeryd --help
Starting server in DEVELOPMENT Mode
Traceback (most recent call last):
  File "./manage.py", line 10, in <module>
    execute_from_command_line(sys.argv)
  File "/Library/Python/2.7/site-packages/django/core/management/__init__.py", line 427, in execute_from_command_line
    utility.execute()
  File "/Library/Python/2.7/site-packages/django
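The clash happens because Django 1.7's management framework registers its own --no-color option, which collides with the one the old django-celery celeryd command adds. With Celery 3.1, Django is supported directly and manage.py celeryd is no longer needed; a minimal sketch of the documented setup (the project name proj is assumed):

```python
# proj/celery.py
from __future__ import absolute_import  # so `import celery` finds the library, not this file

import os

from celery import Celery

# Configure Django settings before the Celery app is created.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')
```

The worker is then started with the celery CLI instead of manage.py: celery -A proj worker -l info.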

ImportError: No module named celery for Celery 3.1 and Python 2.7

Submitted by 馋奶兔 on 2019-12-10 22:02:32
Question: Using Python 2.7 and Celery 3.1.25 on Windows, when we run the Celery worker using celery -A proj worker -l info we get the error:

ImportError: No module named celery

The worker stopped working when we renamed the file celery.py to celeryApp.py and changed the import statement in tasks.py from "from .celery import app" to "from celeryApp import app". Why is this happening? How can we fix the problem?

Directory structure:
/proj/__init__.py
/proj/celeryApp.py
/proj/tasks.py
/proj
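A likely cause, offered as an assumption since the question is cut off above: celery -A proj imports the proj package first, and in the standard layout proj/__init__.py contains from .celery import app as celery_app. After the rename, that stale relative import fails with exactly this ImportError. The bare from celeryApp import app in tasks.py is also fragile, because celeryApp.py lives inside the package. A sketch of the package-qualified imports:

```python
# proj/__init__.py -- update the stale relative import after the rename
from __future__ import absolute_import

from proj.celeryApp import app as celery_app


# proj/tasks.py -- import the app through the package, not as a top-level module
from proj.celeryApp import app

@app.task
def add(x, y):
    return x + y
```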

How to configure and run celerybeat

Submitted by 元气小坏坏 on 2019-12-10 21:42:59
Question: I am just getting started with Celery and am trying to run a periodic task. I configured RabbitMQ, added celeryconfig.py, and added the following code in tasks.py:

from celery.decorators import periodic_task
from datetime import timedelta

@periodic_task(run_every=timedelta(seconds=2))
def every_2_seconds():
    print("Running periodic task!")

Now when I start celerybeat by typing "celerybeat" in my terminal, it starts with the following message:

celerybeat v3.0.3 (Chiastic Slide) is starting. __
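One detail that often trips up newcomers: celerybeat only dispatches the schedule; a separate worker must be running to execute the tasks, and both processes must load a config that imports tasks.py. A minimal celeryconfig.py sketch, assuming a local RabbitMQ with default credentials:

```python
# celeryconfig.py -- Celery 3.0-era setting names
BROKER_URL = 'amqp://guest@localhost//'
CELERY_IMPORTS = ('tasks',)  # make sure the module defining the periodic task is imported
```

With that in place, celeryd -B runs a worker with an embedded beat scheduler, or celerybeat and celeryd can be run as two separate processes.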

Celery Error 'No such transport: amqp'

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-10 21:34:22
Question: Celery was working fine; one day the command-line worker failed to start up with the following trace:

Traceback (most recent call last):
  File "/home/buildslave/venv/bin/celery", line 9, in <module>
    load_entry_point('celery==3.0.7', 'console_scripts', 'celery')()
  File "/home/buildslave/venv/local/lib/python2.7/site-packages/celery/__main__.py", line 14, in main
    main()
  File "/home/buildslave/venv/local/lib/python2.7/site-packages/celery/bin/celery.py", line 942, in main
    cmd.execute_from
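This error usually means Kombu could not import its amqp transport, often because the kombu/amqp (or librabbitmq) packages in the virtualenv got out of sync after an upgrade; reinstalling them (pip install --force-reinstall kombu amqp) is a common fix. A quick diagnostic sketch, assuming a local RabbitMQ with default credentials:

```python
# Check whether Kombu can resolve the amqp transport at all.
from kombu import Connection

conn = Connection('amqp://guest:guest@localhost:5672//')
conn.connect()         # raises "No such transport: amqp" if the transport cannot be imported
print(conn.transport)  # on success, shows which transport implementation was loaded
conn.release()
```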

Django Celery memory not released

Submitted by 佐手、 on 2019-12-10 20:39:11
Question: In my Django project I have the following dependencies:

django==1.5.4
django-celery==3.1.9
amqp==1.4.3
kombu==3.0.14
librabbitmq==1.0.3 (as suggested by https://stackoverflow.com/a/17541942/1452356)

In dev_settings.py:

DEBUG = False
BROKER_URL = "django://"
import djcelery
djcelery.setup_loader()
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERYD_CONCURRENCY = 2
# CELERYD_TASK_TIME_LIMIT = 10

CELERYD_TASK_TIME_LIMIT is commented out as suggested here https://stackoverflow.com
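Since the question is truncated, the exact symptom is unknown, but the usual mitigation for worker memory that grows and is never returned is to recycle worker child processes after a fixed number of tasks. A one-line sketch in the same settings style as above (the value 100 is illustrative):

```python
# dev_settings.py (additional setting)
CELERYD_MAX_TASKS_PER_CHILD = 100  # restart each worker child after 100 tasks, releasing its memory
```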

Have Celery broadcast return results from all workers

Submitted by 天大地大妈咪最大 on 2019-12-10 19:31:41
Question: Is there a way to get all the results from every worker on a Celery broadcast task? I would like to monitor whether everything went OK on all the workers. A list of workers that the task was sent to would also be appreciated.

Answer 1: No, that is not easily possible. But you don't have to limit yourself to the built-in amqp result backend; you can send your own results using Kombu (http://kombu.readthedocs.org), which is the messaging library used by Celery:

from celery import Celery
from kombu import
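The answer's snippet is cut off above. A minimal sketch of the same idea (queue names and broker URL are illustrative, not from the original answer): each worker publishes its outcome, tagged with its hostname, to a dedicated Kombu queue, and the caller drains that queue to see which workers replied:

```python
import socket

from kombu import Connection, Exchange, Queue

results_exchange = Exchange('broadcast_results', type='direct')
results_queue = Queue('broadcast_results', results_exchange, routing_key='results')


def report_result(result, broker_url='amqp://guest@localhost//'):
    """Call this at the end of the broadcast task on each worker."""
    with Connection(broker_url) as conn:
        producer = conn.Producer(serializer='json')
        producer.publish(
            {'worker': socket.gethostname(), 'result': result},
            exchange=results_exchange,
            routing_key='results',
            declare=[results_queue],  # make sure the queue exists
        )


def collect_results(expected, broker_url='amqp://guest@localhost//', timeout=5):
    """Drain one reply per worker; returns the list of bodies received."""
    replies = []

    def on_message(body, message):
        replies.append(body)
        message.ack()

    with Connection(broker_url) as conn:
        with conn.Consumer(results_queue, callbacks=[on_message]):
            while len(replies) < expected:
                conn.drain_events(timeout=timeout)  # raises socket.timeout if a worker is silent
    return replies
```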

Passing an image to a celery task

Submitted by 我们两清 on 2019-12-10 19:02:17
Question: Part of an application I'm writing allows users to upload images, which I then resize and automatically upload to Amazon S3. Currently the image resizing happens right in the view, and I'd like to offload it via Celery to distributed workers. My question is: what's the best way to get the image to the worker? My current thinking is to store the image directly in the database and then just pass the id to the worker and have it retrieve it. Is there a better practice than temporarily
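A common pattern (offered as an assumption, since no answer appears above) is to avoid pushing raw image bytes through the broker or the database: save the upload somewhere both the web process and the workers can reach, pass only a key or path to the task, and let the worker fetch, resize, and upload. A sketch using Pillow; fetch_upload and push_to_s3 are hypothetical storage helpers:

```python
from celery import shared_task
from PIL import Image


@shared_task
def resize_and_upload(storage_key, size=(800, 600)):
    # fetch_upload / push_to_s3 stand in for your shared-storage and S3 code.
    local_path = fetch_upload(storage_key)

    img = Image.open(local_path)
    img.thumbnail(size)  # resize in place, preserving aspect ratio
    img.save(local_path)

    push_to_s3(local_path, storage_key)
```

The view then only calls resize_and_upload.delay(key), so the request returns as soon as the upload is stored.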

Accessing celery worker instance inside the task

Submitted by 大城市里の小女人 on 2019-12-10 18:26:49
Question: I want to use Jupyter kernels inside the Celery worker, with one Jupyter kernel per Celery worker. To achieve this I am overriding Celery's default Worker class: at worker initialisation I start the Jupyter kernel, and in the stop method I shut it down. The problem I am facing now is: how can I access that kernel instance inside the task while the task is running? Is there any better way to override the Worker class definition
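One common alternative to subclassing the Worker class, offered here as a sketch rather than the thread's answer: manage per-process state with Celery's worker signals and keep the handle in a module-level variable that tasks read. The kernel helper calls are hypothetical:

```python
from celery import Celery
from celery.signals import worker_process_init, worker_process_shutdown

app = Celery('proj')  # illustrative app name

kernel = None  # one kernel handle per worker process


@worker_process_init.connect
def start_kernel(**kwargs):
    global kernel
    kernel = launch_jupyter_kernel()  # hypothetical helper that boots the kernel


@worker_process_shutdown.connect
def stop_kernel(**kwargs):
    if kernel is not None:
        kernel.shutdown()  # hypothetical shutdown call


@app.task
def run_in_kernel(code):
    # Tasks running in this process see the kernel started by the init signal.
    return kernel.execute(code)  # hypothetical execute call
```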

Mocking Celery `self.request` attribute for bound tasks when called directly

Submitted by 泄露秘密 on 2019-12-10 18:12:06
Question: I have a task foobar:

@app.task(bind=True)
def foobar(self, owner, a, b):
    if already_working(owner):  # check if a foobar task is already running for owner.
        register_myself(self.request.id, owner)  # add myself in the DB.
    return a + b

How can I mock the self.request.id attribute? I am already patching everything and calling the task directly rather than using .delay/.apply_async, but the value of self.request.id seems to be None (as I am doing real interactions with the DB, it is making the test
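Two approaches that avoid hand-mocking the attribute (a sketch, under the assumption that running the task body eagerly is acceptable in the test): .apply() executes the task in-process with a real request populated, and push_request()/pop_request() wrap a direct .run() call in a fake request context:

```python
# Option 1: .apply() runs the task eagerly and fills in self.request.
result = foobar.apply(args=('some-owner', 1, 2), task_id='fixed-test-id')
assert result.get() == 3

# Option 2: push a request context manually around a direct call.
foobar.push_request(id='fixed-test-id')
try:
    assert foobar.run('some-owner', 1, 2) == 3
finally:
    foobar.pop_request()
```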

Celery Result error “args must be a list or tuple”

Submitted by 走远了吗. on 2019-12-10 18:08:31
Question: I am running a Django website and have just gotten Celery to run, but I am getting confusing errors. Here is how the code is structured. In tests.py:

from tasks import *
from celery.result import AsyncResult

project = Project.objects.create()
# initialize various sub-objects of the project
c = function.delay(project.id)
r = AsyncResult(c.id).ready()
f = AsyncResult(c.id).failed()
# wait until the task is done
while not r and not f:
    r = AsyncResult(c.id).ready()
    f = AsyncResult(c.id).failed()
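The "args must be a list or tuple" error in the title typically comes from handing apply_async a bare value instead of a tuple of positional arguments (an assumption here, since the failing call is not shown above). A sketch of the difference, plus a simpler way to wait on the task:

```python
# Wrong: the first parameter of apply_async must be a list/tuple of args.
# function.apply_async(project.id)        # -> "args must be a list or tuple"

# Right: wrap the single argument in a tuple (note the trailing comma).
c = function.apply_async((project.id,))   # equivalent to function.delay(project.id)

# The AsyncResult returned by delay/apply_async can be polled directly,
# so there is no need to rebuild one from c.id in a busy loop:
c.get(timeout=60)  # blocks until the task finishes; raises if the task failed
```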