celery

celery daemon - permission denied on log file

冷暖自知 submitted on 2019-12-07 04:54:43

Question: I have been working on setting up my Celery task as a daemon in order to process data on a schedule. I have been following the docs to set up the daemon, but I keep running into a log file permission error that has me stumped. Below is the configuration I have set up on an Ubuntu box on DigitalOcean:

```
# /etc/default/celeryd
# here we have a single node
CELERYD_NODES="w1"
CELERY_BIN="/mix_daemon/venv/bin/celery"
CELERYD_CHDIR="/mix_daemon/"
CELERYD_OPTS="-A tasks worker --loglevel…
```
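The usual cause of this error is that the init script runs the worker as an unprivileged user who cannot write to the default log location. A minimal sketch of the extra `/etc/default/celeryd` settings that address this, per the Celery daemonizing docs (the user/group names and paths here are assumptions, not from the question):

```
# run the workers as an unprivileged user (assumed to exist)
CELERYD_USER="celery"
CELERYD_GROUP="celery"

# explicit log/pid locations; %n expands to the node name, %I to the process index
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"

# ask the init script to create the log/pid directories owned by that user
export CELERY_CREATE_DIRS=1
```

With these set, the init script creates the directories with the right ownership instead of the worker trying to write somewhere it has no access.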

Broadcast messages in celery

寵の児 submitted on 2019-12-07 04:54:43

Question: I'm using Celery and want to send a broadcast task to a couple of workers. I'm trying to do it as described at http://docs.celeryproject.org/en/latest/userguide/routing.html#broadcast, so I create a simple app with a task:

```python
@celery.task
def do_something(value):
    print value
```

and in the app I set:

```python
from kombu.common import Broadcast

CELERY_QUEUES = (Broadcast('broadcast_tasks'),)
CELERY_ROUTES = {'my_app.do_something': {'queue': 'broadcast_tasks'}}
```

and then I was trying to send the task to the workers with: my…
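For reference, a broadcast task reaches every worker only if each worker actually consumes from the broadcast queue. A minimal sketch following the question's setup (the app name, broker URL and task argument are illustrative assumptions):

```python
from celery import Celery
from kombu.common import Broadcast

celery = Celery('my_app', broker='amqp://guest@localhost//')
celery.conf.update(
    CELERY_QUEUES=(Broadcast('broadcast_tasks'),),
    CELERY_ROUTES={'my_app.do_something': {'queue': 'broadcast_tasks'}},
)

@celery.task
def do_something(value):
    print(value)

# each worker started with `-Q broadcast_tasks` gets its own copy of the message
do_something.apply_async(args=('hello',), queue='broadcast_tasks')
```

A Broadcast queue is backed by a fanout exchange, so each consuming worker binds its own hidden queue to it and receives an independent copy.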

Django celery crontab every 30 seconds - is it even possible?

℡╲_俬逩灬. submitted on 2019-12-07 04:39:43

Question: Is it possible to run the Django Celery crontab every 30 seconds DURING SPECIFIC HOURS? There are only settings for minutes, hours and days. I have the crontab working, but I'd like to run it every 30 seconds, as opposed to every minute. Alternatively... is it possible to turn on a 30-second interval schedule only for a certain period of the day?

Answer 1: The very first example they have in the documentation is... Example: run the tasks.add task every 30 seconds. from…
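The documentation example the answer refers to uses a timedelta interval, since crontab() cannot express sub-minute intervals. A sketch of that schedule, plus one common workaround for the "specific hours" part, namely running all day and bailing out inside the task (the task name, arguments and hour range are illustrative assumptions):

```python
from datetime import datetime, timedelta

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),  # interval schedules can go below a minute
        'args': (16, 16),
    },
}

def within_allowed_hours():
    # guard called at the top of the task; runs outside 09:00-17:00 return early
    return 9 <= datetime.now().hour < 17
```

This trades a few no-op task invocations outside the window for a schedule that beat itself can express.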

How do I permanently remove a celery task from rabbitMQ?

眉间皱痕 submitted on 2019-12-07 04:26:06

Question: I have around 10,000 scheduled tasks in my current Celery setup. I didn't realize what scheduled tasks were and decided to use them to send follow-up emails months in advance. Looking back, it's probably never a good idea to schedule a task more than an hour in the future, because every time you restart a worker it has to re-receive every scheduled task from RabbitMQ, and then they all just sit in memory. My problem is that if I have to revoke a task, it doesn't just get deleted. The task stays…
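Two distinct operations are involved here: revoke() only marks the task id so workers skip it when the message is finally delivered, while purging is what actually deletes messages from RabbitMQ. A sketch (broker URL and task id are placeholders):

```python
from celery import Celery

app = Celery(broker='amqp://guest@localhost//')

# Marks the id as revoked; workers remember revoked ids (in memory, or on
# disk if started with --statedb), but the message itself stays in RabbitMQ.
app.control.revoke('the-task-id', terminate=False)

# Actually deletes messages -- but purges ALL waiting messages from the
# workers' queues, not just one task, so use with care.
app.control.purge()
```

There is no built-in way to delete a single scheduled message from the broker, which is one more argument against scheduling tasks months ahead.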

Route celery task to specific queue

旧巷老猫 submitted on 2019-12-07 04:15:41

Question: I have two separate celeryd processes running on my server, managed by Supervisor. They are set to listen on separate queues, like so:

```
[program:celeryd1]
command=/path/to/celeryd --pool=solo --queues=queue1
...

[program:celeryd2]
command=/path/to/celeryd --pool=solo --queues=queue2
...
```

And my celeryconfig looks something like this:

```python
from celery.schedules import crontab

BROKER_URL = "amqp://guest:guest@localhost:5672//"
CELERY_DISABLE_RATE_LIMITS = True
CELERYD_CONCURRENCY = 1
CELERY_IGNORE…
```
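With workers already pinned to queues like this, routing is a matter of mapping task names to queues in the config, or overriding the queue per call. A sketch (the task module paths are illustrative assumptions, not from the question):

```python
# celeryconfig.py: static routing by task name
CELERY_ROUTES = {
    'myapp.tasks.fast_task': {'queue': 'queue1'},
    'myapp.tasks.slow_task': {'queue': 'queue2'},
}

# or decide per call instead of globally:
# fast_task.apply_async(args=(1, 2), queue='queue1')
```

Any task without a route falls back to the default queue, which neither worker above consumes, so unrouted tasks would silently pile up.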

can't import django model into celery task

天大地大妈咪最大 submitted on 2019-12-07 03:40:00

Question: I have the following task:

```python
from __future__ import absolute_import
from myproject.celery import app
from myapp.models import Entity

@app.task
def add(entity_id):
    entity = Entity.objects.get(pk=entity_id)
    return entity.name
```

I get the following error:

```
django.core.exceptions.ImproperlyConfigured: Requested setting DEFAULT_INDEX_TABLESPACE, but settings are not configured. You must either define the environment variable DJANGO_SETTINGS_MODULE or call settings.configure() before accessing settings.
```

Django Celery beat crashes on start

假装没事ソ submitted on 2019-12-07 03:30:52

Question: I have recently configured a new server with RabbitMQ and Celery. When I try to start celerybeat on the machine, it starts for a few seconds and then stops. I have given the right permissions to the log files and changed their owner to the application user. I have also checked the celerybeat.log file and NO errors are registered. I tried to start it this way in the project folder:

```
./manage.py celerybeat
```

And I got this error:

```
[2010-12-01 09:59:46,127: INFO/MainProcess] process shutting down
```

Could someone…
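When beat exits silently like this, two common first steps are raising the log level and removing a possibly corrupt local schedule database, which beat recreates on startup. A sketch (the schedule filename is the default; adjust if overridden in settings):

```
# run beat in the foreground with verbose logging to surface the real error
./manage.py celerybeat --loglevel=DEBUG

# a corrupt celerybeat-schedule shelve file can make beat die on startup;
# it is safe to delete, beat rebuilds it from the configured schedule
rm celerybeat-schedule
```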

How to manually mark a Celery task as done and set its result?

自古美人都是妖i submitted on 2019-12-07 03:27:07

Question: I have this Celery task:

```python
@app.task
def do_something(with_this):
    # instantiate a class from a third party library
    instance = SomeClass()

    # this class uses callbacks to send progress info about
    # the status and progress of what we're doing
    def progress_callback(data):
        # this status will change to 'finished' later,
        # but the return value that I want as the task result won't be returned,
        # so this is where I should mark the task as done manually
        if data['status'] == 'working':
            # I create a custom…
```
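One way to do this from inside the callback is a bound task that writes to the result backend directly. A sketch under the question's assumptions (SomeClass, its run() entry point and the callback payload all come from the question and are placeholders):

```python
from celery.exceptions import Ignore

@app.task(bind=True)
def do_something(self, with_this):
    instance = SomeClass()  # third-party class from the question

    def progress_callback(data):
        if data['status'] == 'working':
            # intermediate progress: custom state, readable via AsyncResult.info
            self.update_state(state='PROGRESS', meta=data)
        elif data['status'] == 'finished':
            # manually store the final result and mark the task SUCCESS
            self.backend.mark_as_done(self.request.id, data)

    instance.run(progress_callback)  # assumed entry point
    # prevent the task's own return value from overwriting the stored result
    raise Ignore()
```

Raising Ignore() at the end is the usual companion to a manual mark_as_done(), since otherwise the task's return value would replace the result you just stored.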

AsyncResult(task_id) returns “PENDING” state even after the task started

三世轮回 submitted on 2019-12-07 03:22:46

Question: In the project, I try to poll task.state of a long-running task and update its running status. It worked in development, but it doesn't work after moving the project to the production server: I keep getting 'PENDING' even though I can see in Flower that the task has started. However, I still get the result once the task finishes, i.e. when task.state == 'SUCCESS'. I use Python 2.6, Django 1.6 and Celery 3.1 in production, with the AMQP result backend.

```python
@csrf_exempt
def poll_state(request):
    data = 'Fail'…
```
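Two things worth checking in this situation: Celery reports PENDING both for tasks that have not started and for task ids it simply does not know about, and workers do not report a STARTED state at all unless CELERY_TRACK_STARTED is enabled. A sketch of a polling view under those assumptions (the request parameter name is illustrative):

```python
import json
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt
from celery.result import AsyncResult

# settings.py: CELERY_TRACK_STARTED = True  (off by default in Celery 3.1)

@csrf_exempt
def poll_state(request):
    task_id = request.POST.get('task_id')
    result = AsyncResult(task_id)  # PENDING also means "unknown id"
    return HttpResponse(json.dumps({'state': result.state}),
                        content_type='application/json')
```

With the AMQP result backend specifically, a result message can also be consumed only once, so anything else reading the same result can leave later polls seeing PENDING.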

Celery tries to connect to the wrong broker

↘锁芯ラ submitted on 2019-12-07 03:22:35

Question: I have this in my Celery configuration:

```python
BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
```

Yet whenever I run celeryd, I get this error:

```
consumer: Cannot connect to amqp://guest@127.0.0.1:5672//: [Errno 111] Connection refused. Trying again in 2.00 seconds...
```

Why is it not connecting to the Redis broker I set it up with, which is running, by the way?

Answer 1: Import your Celery app and pass the broker explicitly, like this:

```python
celery = Celery('task', broker='redis://127.0.0.1:6379')
celery…
```
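The amqp://guest@127.0.0.1:5672 in the error is Celery's built-in default broker, which means the worker never saw the BROKER_URL setting at all. Besides passing the broker explicitly as the answer shows, the other fix is to make sure the config module is actually loaded. A sketch (the module name celeryconfig is the conventional assumption):

```python
from celery import Celery

# either pass the broker explicitly...
celery = Celery('task', broker='redis://127.0.0.1:6379/0')

# ...or make sure the module that defines BROKER_URL is really loaded:
celery.config_from_object('celeryconfig')
```

If neither is done, or the worker is started against a different app instance than the one that was configured, Celery silently falls back to the AMQP default.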