celery

Using django-nose and django-celery together — unit testing

Submitted by 风流意气都作罢 on 2019-12-13 12:56:50

Question: I have a Django project that uses django-nose, and I'd like to add django-celery to it. I use unit tests. Both django-nose and django-celery need a TEST_RUNNER setting in my settings.py file. Specifically, TEST_RUNNER = 'django_nose.NoseTestSuiteRunner' for django-nose, and TEST_RUNNER = 'djcelery.contrib.test_runner.CeleryTestSuiteRunner' for django-celery. How should I handle this so that I can use both packages?

Answer 1: I found that the best way to handle this is to skip the Celery test…
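The answer is cut off above, but a common way to make both packages coexist, assuming (as the truncated answer suggests) that djcelery's CeleryTestSuiteRunner does little more than force tasks to run eagerly, is to keep the django-nose runner and replicate the eager behavior yourself in settings.py. A minimal sketch using the pre-4.0 setting names that django-celery understands:

```python
# settings.py -- a minimal sketch, assuming djcelery's test runner only
# forces eager (synchronous, in-process) task execution during tests.
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'

# Execute tasks locally and synchronously instead of sending them
# to a broker, so unit tests never need a running worker.
CELERY_ALWAYS_EAGER = True
# Re-raise exceptions from eager tasks so test failures surface normally.
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
```

With these settings, calling task.delay(...) inside a test runs the task body immediately in the test process, which is essentially what the Celery runner was providing.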

Using multiprocessing pool from celery task raises exception

Submitted by 一个人想着一个人 on 2019-12-13 11:40:29

Question: FOR THOSE READING THIS: I have decided to use RQ instead, which doesn't fail when running code that uses the multiprocessing module; I suggest you use that. I am trying to use a multiprocessing pool from within a Celery task, using Python 3 and Redis as the broker (running on a Mac). However, I don't seem to be able to even create a multiprocessing Pool object from within the Celery task! Instead, I get a strange exception that I really don't know what to do with. Can anyone tell me how to…
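The exception itself is truncated away, but the usual culprit in this setup is that Celery's prefork pool runs tasks in daemonic child processes, and Python refuses to let daemonic processes spawn children ("daemonic processes are not allowed to have children"). A commonly cited workaround, sketched here as a suggestion rather than something from the question, is to build the pool from billiard, Celery's fork of multiprocessing:

```python
# tasks.py -- a minimal sketch, assuming Redis on localhost as the broker.
# billiard is Celery's own fork of multiprocessing; unlike the stdlib
# module, it allows creating a Pool inside a daemonic worker child.
from billiard import Pool
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')


def _square(x):
    return x * x


@app.task
def parallel_squares(numbers):
    # Build the pool inside the task so each invocation manages
    # (and cleanly tears down) its own child processes.
    with Pool(processes=4) as pool:
        return pool.map(_square, numbers)
```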

Trouble installing supervisord with celery

Submitted by ﹥>﹥吖頭↗ on 2019-12-13 07:34:44

Question: I have a Django project on an Ubuntu EC2 node, which I have been using to set up an asynchronous task queue with Celery. I am following http://michal.karzynski.pl/blog/2014/05/18/setting-up-an-asynchronous-task-queue-for-django-using-celery-redis/ along with the docs. I've also looked at "Run a celery worker in the background" and http://thomassileo.com/blog/2012/08/20/how-to-keep-celery-running-with-supervisor/. In /etc/supervisor/conf.d/tp-celery.conf I have:

```
[program:tp-celery]
command=/home/ubuntu…
```
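The config is cut off after the command= line; for orientation, a complete supervisord program section often looks like the sketch below. Every path, the virtualenv location, and the app name tp are placeholder assumptions, not the asker's real values:

```
; a hedged sketch of a full supervisord stanza; all paths are placeholders
[program:tp-celery]
command=/home/ubuntu/.virtualenvs/tp/bin/celery -A tp worker --loglevel=INFO
directory=/home/ubuntu/tp
user=ubuntu
stdout_logfile=/var/log/celery/tp-worker.log
stderr_logfile=/var/log/celery/tp-worker.err
autostart=true
autorestart=true
; give in-flight tasks time to finish before SIGKILL on shutdown
stopwaitsecs=600
```

After editing a file under conf.d, running supervisorctl reread followed by supervisorctl update picks up the new program.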

Need to restart Python while applying Celery config

Submitted by 我怕爱的太早我们不能终老 on 2019-12-13 07:03:13

Question: That's a small story... I had this error:

AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

I changed tasks.py as Diederik suggested in "Celery with RabbitMQ: AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'":

app = Celery('tasks', backend='rpc://', broker='amqp://guest@localhost//')

and ran it:

>>> from tasks import add
>>> result = add.delay(4,50)
>>> result.ready()

I got DisabledBackend again... hmm, what was that... I put the code into a file, run.py…
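As the title hints, the resolution is usually that both the worker and any already-running Python shell keep the configuration they were started with, so after adding backend='rpc://' both processes must be restarted. A minimal sketch of the working tasks.py, assuming RabbitMQ on localhost:

```python
# tasks.py -- a minimal sketch, assuming RabbitMQ running locally.
from celery import Celery

# The result backend must be configured *before* the worker and the
# client shell start; processes launched earlier keep the old config
# and raise the DisabledBackend AttributeError when asked for results.
app = Celery('tasks', backend='rpc://', broker='amqp://guest@localhost//')


@app.task
def add(x, y):
    return x + y
```

After restarting the worker (celery -A tasks worker -l info) and opening a fresh interpreter, result.ready() should work without the DisabledBackend error.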

Celery: Make sure workers are not running only jobs from one user

Submitted by 拜拜、爱过 on 2019-12-13 05:18:53

Question: I have 4 Celery workers, each with a concurrency of 6. I have users submitting varying numbers of jobs (from 1 to 20). How do I ensure that each user's jobs get equal processing time, and that one user's jobs do not fill up the queue, forcing other users' jobs to wait? I am afraid that if the workers end up going through all the jobs submitted by the first user, the other users' queued jobs must wait for the first user to finish, which is an inconvenience. Is there a way to make the Celery workers aware of one…
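The question is truncated before any answer, but one common approach to this kind of per-user fairness, offered here as a suggestion rather than something from the source, is to stop workers from prefetching large batches (so one user's burst can't be hoarded) and to route each user's jobs to a dedicated queue that the broker then interleaves. A hedged sketch; the queue-naming scheme and helper are hypothetical:

```python
# A hedged sketch of per-user routing; names here are hypothetical.
from celery import Celery

app = Celery('jobs', broker='redis://localhost:6379/0')

# Deliver tasks to worker processes one at a time (Celery 4 setting
# name) instead of prefetching a whole batch from one user's burst.
app.conf.worker_prefetch_multiplier = 1


@app.task
def process_job(payload):
    ...  # the actual work


def submit_job(user_id, payload):
    # Hypothetical helper: each user gets a dedicated queue, so the
    # broker round-robins deliveries across the users' queues.
    process_job.apply_async(args=[payload], queue='user-{}'.format(user_id))
```

Workers would then run with -O fair and be told which queues to consume (e.g. celery -A jobs worker -O fair -Q user-1,user-2), or queues can be added at runtime via celery's add_consumer control command.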

How can I configure celery to run on startup of nginx?

Submitted by 爷,独闯天下 on 2019-12-13 04:05:08

Question: I have Celery running locally by just running celery -A proj -l info (although I don't even know if I should be using this command in production), and I want Celery to run on my production web server every time nginx starts. The init system is systemd.

Answer 1: Create a service file like this, celery.service:

```
[Unit]
Description=celery service
After=network.target

[Service]
PIDFile=/run/celery/pid
User=celery
Group=celery
RuntimeDirectory=celery
WorkingDirectory=/path/to/project…
```
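The answer is cut off mid-file; a complete unit along the same lines might look like the sketch below. ExecStart/ExecStop, the virtualenv path, and the node name worker1 are assumptions filling in for the truncated answer:

```
# /etc/systemd/system/celery.service -- a hedged sketch; all paths are
# placeholders, not values from the original answer.
[Unit]
Description=celery service
After=network.target

[Service]
Type=forking
User=celery
Group=celery
# systemd creates /run/celery owned by the service user.
RuntimeDirectory=celery
WorkingDirectory=/path/to/project
PIDFile=/run/celery/pid
ExecStart=/path/to/venv/bin/celery multi start worker1 -A proj \
    --pidfile=/run/celery/pid --logfile=/var/log/celery/worker1.log -l info
ExecStop=/path/to/venv/bin/celery multi stopwait worker1 --pidfile=/run/celery/pid

[Install]
WantedBy=multi-user.target
```

systemctl enable celery makes it start on boot; note that nginx and the worker are independent services, so "starts when nginx starts" in practice means both are enabled at boot.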

Celery - Permission Problem - Create folder

Submitted by China☆狼群 on 2019-12-13 03:57:14

Question: I use Celery (a job manager) in production for a website (Django) on a CentOS 7 server. My problem is that, inside a Celery task, my function does not create a folder (see my_function). The function:

```
def my_function():
    parent_folder = THE_PARENT_PATH
    if not os.path.exists(centrifuge_recentrifuge_work_dir_path):
        os.makedirs(centrifuge_recentrifuge_work_dir_path)
        # The folder THE_PARENT_PATH is created
    celery_task(parent_folder)
```

The Celery task:

```
@app.task(name='a task')
def celery_task(parent_folder):…
```
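The excerpt is truncated, but failures like this usually trace back to the worker process running as a different user (or in a different working directory) than the code that works outside Celery. A hedged diagnostic sketch: create the folder inside the task itself and log which user and directory the worker actually has; the broker URL and task body are assumptions:

```python
# A hedged diagnostic sketch; broker URL and names are placeholders.
import getpass
import os

from celery import Celery
from celery.utils.log import get_task_logger

app = Celery('proj', broker='redis://localhost:6379/0')
logger = get_task_logger(__name__)


@app.task(name='a task')
def celery_task(parent_folder):
    # Under supervisord/systemd the worker often runs as a service
    # account without write access to the target path; log who we are.
    logger.info('worker user=%s cwd=%s', getpass.getuser(), os.getcwd())
    # exist_ok avoids racing the caller, which may create it first.
    os.makedirs(parent_folder, exist_ok=True)
```

If the log shows an unexpected user, either grant that account write access to the parent path or run the worker as the owning user.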

Ubuntu services (Upstart or systemd): Django development server as a service

Submitted by 删除回忆录丶 on 2019-12-13 03:46:33

Question: I've been working with Python and the Django framework for a while on Ubuntu 16.04. I use Django with a task queue (Celery) and some other enhancement apps. Every time I want to run everything, I need to start the development server (python manage.py runserver) and then the Celery worker (celery -A filename worker -l info). Each time I work, it takes me minutes to enter the directory and start it all up. I looked around and came up with the idea of setting it up as a service. For example,…
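The example the asker mentions is cut off; on Ubuntu 16.04, which ships systemd, a development-only unit might look like this sketch. All paths, the user, and the project name are placeholders, and runserver is of course not meant for production:

```
# /etc/systemd/system/django-dev.service -- a hedged development-only
# sketch; paths, user, and project layout are placeholders.
[Unit]
Description=Django development server
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/myproject
ExecStart=/home/ubuntu/myproject/venv/bin/python manage.py runserver 0.0.0.0:8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

sudo systemctl enable --now django-dev starts it immediately and on every boot; a companion unit for the Celery worker follows the same shape with ExecStart pointing at celery -A filename worker -l info.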

Start SQS celery worker on Elastic Beanstalk

Submitted by 寵の児 on 2019-12-13 03:35:57

Question: I am trying to start a Celery worker on EB but get an error which doesn't explain much. The command in the config file in the .ebextensions dir:

```
03_celery_worker:
  command: "celery worker --app=config --loglevel=info -E --workdir=/opt/python/current/app/my_project/"
```

The listed command works fine on my local machine (with just the workdir parameter changed). Errors from the EB:

Activity execution failed, because: /opt/python/run/venv/local/lib/python3.6/site-packages/celery/platforms.py:796: RuntimeWarning: You're…
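The warning is truncated, but that line of celery/platforms.py typically emits "You're running the worker with superuser privileges" -- .ebextensions commands run as root during deployment, and a foreground worker would also block the deploy. A hedged sketch of a detached variant; the --uid/--detach/--pidfile/--logfile additions are suggestions rather than a confirmed fix, the rest mirrors the asker's command, and wsgi is assumed to be the EB Python platform's app user:

```
# .ebextensions/celery.config -- a hedged sketch; flags added to the
# asker's command are suggestions, not a confirmed fix.
container_commands:
  03_celery_worker:
    command: |
      celery worker --app=config --loglevel=info -E \
        --workdir=/opt/python/current/app/my_project/ \
        --uid=wsgi --detach \
        --pidfile=/tmp/celery.pid --logfile=/tmp/celery.log
```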

Celery discovers all tasks even when `app.autodiscover_tasks()` is not called

Submitted by 大憨熊 on 2019-12-13 03:35:12

Question: I am using Django==2.0.5 and celery==4.0.2. My proj/proj/celery.py looks like:

```
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj', include=[])

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related…
```
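The file is cut off mid-comment, but the behavior in the title usually has a simple cause: tasks declared with @shared_task attach to every Celery app at import time, so any module imported during Django startup registers its tasks even though autodiscover_tasks() is never called and include is empty. A minimal sketch of the mechanism; the module and task names are hypothetical:

```python
# myapp/tasks.py -- a hedged sketch; module and task names are made up.
from celery import shared_task


# shared_task binds to *all* Celery app instances at import time, so
# merely importing this module (e.g. from myapp/__init__.py or a
# signal handler loaded by Django) registers the task with the app
# even though app.autodiscover_tasks() is never called.
@shared_task
def ping():
    return 'pong'
```

Inspecting app.tasks after startup, or tracing which modules import the tasks files, usually pinpoints the registration path.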