celery

Celery workers missing heartbeats and getting substantial drift on EC2

依然范特西╮ submitted on 2019-12-23 11:01:26
Question: I am testing my celery implementation across 3 EC2 machines right now. I am fairly confident in my implementation, but I am having problems with the actual worker execution. My test setup is as follows: one EC2 machine is designated as the broker and also runs a celery worker; one EC2 machine is designated as the client (it runs the client celery script that enqueues all the tasks using .delay()) and also runs a celery worker; one EC2 machine is purely a worker. All the machines run 1 celery worker
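A minimal sketch of the enqueue side of a setup like the one described above; the broker URL, app name, and task body are placeholders, not the asker's code:

    from celery import Celery

    # broker URL is an assumption; point it at the machine designated as the broker
    app = Celery('client', broker='amqp://guest@broker-host//')

    @app.task
    def process(item):
        # placeholder work; the real task body is whatever the workers execute
        return item * 2

    if __name__ == '__main__':
        # the client script enqueues tasks with .delay(); any of the three workers may pick them up
        results = [process.delay(i) for i in range(100)]

Each of the three machines would then run "celery -A client worker" against the same broker URL.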

Django-celery: Passing a request object to a worker

若如初见. submitted on 2019-12-23 09:37:51
Question: How can I pass a Django request object to a celery worker? When I try to pass the request object it throws an error: Can't Pickle Input Objects. It seems that celery serializes any arguments passed to the worker. I tried using other serialization methods like JSON: CELERY_TASK_SERIALIZER = "JSON" But it is not working. Is it possible to configure celery so that it won't serialize data? Or can I convert the request object to a string before passing it to the worker and then convert it back to an object in the worker?
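A common workaround, sketched below with hypothetical view and task names, is to pass plain, serializable values extracted from the request rather than the request object itself:

    from celery import shared_task
    from django.http import HttpResponse

    @shared_task
    def handle_submission(user_id, post_data, path):
        # plain ints, dicts and strings survive any serializer (JSON or pickle)
        return '%s posted %d fields to %s' % (user_id, len(post_data), path)

    def my_view(request):
        # extract only what the task actually needs from the request
        handle_submission.delay(request.user.id, dict(request.POST), request.path)
        return HttpResponse('queued')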

Can I get a celery task's arguments if all I have is the task ID?

岁酱吖の submitted on 2019-12-23 09:26:47
Question: If I have the original task I can get the arguments from task.request.args, but if I only have the task ID, is there a way to get the arguments? It doesn't look like there is a way to get them from an AsyncResult object, and as far as I can tell there isn't a way to re-create the task. I want to do this because I have a frontend that polls the backend for updates on tasks, and it would be useful if it could display the task arguments. Seeing as the arguments are stored with the broker, this
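For tasks that have not finished yet, the worker inspection API can recover the original arguments. A rough sketch (the broker URL and helper name are placeholders, and this says nothing about tasks that have already completed):

    from celery import Celery

    app = Celery(broker='redis://localhost:6379/0')

    def find_task_args(task_id):
        insp = app.control.inspect()
        # look through currently running, reserved and eta-scheduled tasks on all workers
        for method in (insp.active, insp.reserved, insp.scheduled):
            replies = method() or {}
            for worker, tasks in replies.items():
                for t in tasks:
                    info = t.get('request', t)  # scheduled() nests the task under 'request'
                    if info.get('id') == task_id:
                        return info.get('args')
        return None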

ImproperlyConfigured(“settings.DATABASES is improperly configured. ”) error when trying to set up Django

谁说我不能喝 submitted on 2019-12-23 07:48:44
Question: Attempting to follow the instructions here to set up a Django instance on Heroku. Got as far as the installation of Celery, up to the following step: $ python manage.py syncdb when I get the following error: raise ImproperlyConfigured("settings.DATABASES is improperly configured. ") django.core.exceptions.ImproperlyConfigured: settings.DATABASES is improperly configured. Please supply the ENGINE value. Check settings documentation for more details. I believe that I have my settings.py file in
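This error means DATABASES['default'] has no ENGINE set. A minimal sketch of a Heroku-style configuration, assuming the dj-database-url package and a DATABASE_URL environment variable (the fallback URL is a placeholder):

    # settings.py (sketch)
    import dj_database_url

    DATABASES = {
        # reads DATABASE_URL from the environment, falling back to local PostgreSQL
        'default': dj_database_url.config(default='postgres://localhost/mydb'),
    }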

Globally accessible object across all Celery workers / memory cache in Django

人盡茶涼 submitted on 2019-12-23 07:43:55
Question: I have a pretty standard Django+RabbitMQ+Celery setup with 1 Celery task and 5 workers. The task uploads the same (I simplify a bit) big file (~100MB) asynchronously to a number of remote PCs. All is working fine at the expense of using lots of memory, since every task/worker loads that big file into memory separately. What I would like to do is to have some kind of cache, accessible to all tasks, i.e. load the file only once. Django caching based on locmem would be perfect, but like the documentation
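A sketch of the shared-cache idea, assuming a Redis-backed Django cache (for example the django-redis package) reachable from every worker process; names, URLs and paths are placeholders:

    # settings.py (sketch)
    CACHES = {
        'default': {
            'BACKEND': 'django_redis.cache.RedisCache',
            'LOCATION': 'redis://127.0.0.1:6379/1',
        }
    }

    # tasks.py (sketch)
    from django.core.cache import cache

    def get_big_file():
        data = cache.get('big_file')
        if data is None:
            # loaded once, then served to every worker from the shared cache
            with open('/path/to/big_file', 'rb') as f:
                data = f.read()
            cache.set('big_file', data, timeout=None)  # None caches the value with no expiry
        return data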

python celery - ImportError: No module named _curses - while attempting to run manage.py celeryev

折月煮酒 submitted on 2019-12-23 07:11:15
Question: Background: Windows 7 x64, Python 2.7, Django 1.4, Celery with Redis bundle. While trying to run manage.py celeryev, I get the following error in the terminal: import curses File 'c:\Python2\lib\curses\__init__.py', line 15, in <module> from _curses import * ImportError: No module named _curses I've tried looking at other posts, but haven't been able to solve this problem. Any thoughts on what is causing this error? Thanks in advance. Answer 1: According to http://docs.python.org/library/curses.html

Celery - one task per second

好久不见. submitted on 2019-12-23 04:54:38
Question: I use Celery to make requests to a server (in tasks). I have a hard limit - only 1 request per second (from one IP). I read this, so it's what I want - 1/s. In celeryconfig.py I have: CELERY_DISABLE_RATE_LIMITS = False CELERY_DEFAULT_RATE_LIMIT = "1/s" But I still get messages that I am making too many requests per second. In call.py I use groups. I think rate_limits does not work because I have a mistake in celeryconfig.py. How do I fix that? Thanks! Answer 1: When you start a celery worker with
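For reference, a per-task rate limit can also be set directly on the task decorator; a minimal sketch (task name, broker URL and request target are placeholders). Note that Celery enforces rate_limit per worker instance, not globally across all workers:

    from celery import Celery
    import requests

    app = Celery('tasks', broker='redis://localhost:6379/0')

    @app.task(rate_limit='1/s')  # at most one execution per second on each worker
    def fetch(url):
        return requests.get(url).status_code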

Celery result.get times out

白昼怎懂夜的黑 submitted on 2019-12-23 04:42:53
Question: I have two different Django projects, say projA and projB; each has its own celery daemon running on a separate queue but the same vhost. projA has a task taskA and projB has a task taskB. I try to run taskB from inside taskA, e.g.

    @task(routing_key='taskA')
    def taskA(event_id):
        # do some work, then call taskB and wait for the result
        result = send_task('taskB', routing_key='taskB')
        res = result.get(timeout=20)

I can see in the logs of projB that taskB finished within a second, but taskA keeps on
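Blocking on result.get() inside another task is a well-known timeout/deadlock trap. One alternative pattern, sketched below with hypothetical task and queue names (not the asker's fix), is to attach a callback so taskB feeds its return value into a follow-up task in projA instead of taskA waiting:

    from celery import Celery, signature

    app = Celery('projA', broker='amqp://guest@localhost//')

    @app.task
    def handle_b_result(result, event_id):
        # runs in projA once taskB has finished; result is prepended by the link callback
        print('taskB returned %r for event %s' % (result, event_id))

    def kick_off(event_id):
        # enqueue taskB in projB and chain its result into handle_b_result
        callback = signature('projA.tasks.handle_b_result', args=(event_id,)).set(queue='taskA')
        app.send_task('taskB', routing_key='taskB', link=callback)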

periodic tasks in Celery 4.0

谁都会走 submitted on 2019-12-23 04:30:14
Question: As I know, since Celery 3.1 the @periodic_task decorator is deprecated. So I am trying to run an example from the celery docs, and can't work out what I am doing wrong. I have the following code in task_planner.py:

    from celery import Celery
    from kombu import Queue, Exchange

    class Config(object):
        CELERY_QUEUES = (
            Queue(
                'try',
                exchange=Exchange('try'),
                routing_key='try',
            ),
        )

    celery = Celery('tasks', backend='redis://', broker='redis://localhost:6379/0')
    celery.config_from_object(Config)
    celery.conf
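For comparison, a minimal Celery 4 periodic-task sketch using the beat_schedule setting, which replaces the old @periodic_task decorator (module, task names and URLs are placeholders):

    from celery import Celery

    app = Celery('tasks', broker='redis://localhost:6379/0', backend='redis://')

    # run tasks.hello every 10 seconds
    app.conf.beat_schedule = {
        'run-hello-every-10-seconds': {
            'task': 'tasks.hello',
            'schedule': 10.0,
        },
    }

    @app.task
    def hello():
        print('hello')

Start it with a worker plus the beat scheduler, e.g. "celery -A tasks worker -B -l info".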

Django Celery: Celery task does not create a record in the DB

 ̄綄美尐妖づ submitted on 2019-12-23 03:07:27
Question: I want to create database records with a celery task. But for some reason the object.save() method is not working with task.apply_async() (apply tasks asynchronously). The same record (Ticker) is saved in the database when the celery task is run locally: get_all_tickers.apply() But it is not saved in asynchronous mode: get_all_tickers.apply_async() In both cases the INSERT statement is visible in the server log. models.py:

    class Ticker(TimeStampedModel):
        ask = models.DecimalField(max_digits=18,
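For comparison, a minimal sketch of a task that persists a model instance (the task name and field value are placeholders). One thing worth checking in a setup like this is that the worker process is started with the same Django settings module, and therefore the same DATABASES, as the process that runs .apply() locally:

    from celery import shared_task
    from .models import Ticker

    @shared_task
    def save_one_ticker():
        # the worker opens its own DB connection using whatever settings it was started with
        Ticker.objects.create(ask='1.2345')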