celery

Unit testing with django-celery?

Submitted by 本小妞迷上赌 on 2019-11-28 13:41:12
Question: I am trying to come up with a testing methodology for our django-celery project. I have read the notes in the documentation, but they didn't give me a good idea of what to actually do. I am not worried about testing the tasks in the actual daemons, just the functionality of my code. Mainly I am wondering: how can we bypass task.delay() during the test (I tried setting CELERY_ALWAYS_EAGER = True but it made no difference)? How do we use the recommended test settings (if that is the best …
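A minimal sketch of the eager-mode test settings, assuming the pre-4.x setting names used in the question (the settings_test.py module name is my own); note that CELERY_ALWAYS_EAGER only takes effect if it is set before the Celery app is configured, which is a common reason it appears to make no difference:

    # settings_test.py -- hypothetical test settings module
    from settings import *  # noqa: F401,F403 -- inherit the project settings

    # Run tasks synchronously in the current process, so task.delay()
    # executes inline during tests instead of going to a worker.
    CELERY_ALWAYS_EAGER = True

    # Re-raise exceptions from eagerly run tasks so failures surface
    # in the test run instead of being swallowed by the result wrapper.
    CELERY_EAGER_PROPAGATES_EXCEPTIONS = True

Run the suite against it with python manage.py test --settings=settings_test (module path illustrative).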

Celery: how to limit number of tasks in queue and stop feeding when full?

Submitted by 不想你离开。 on 2019-11-28 13:29:39
I am very new to Celery and here is my question: suppose I have a script that is supposed to constantly fetch new data from a DB and send it to workers using Celery.

tasks.py:

    # Celery task
    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def process_data(x):
        # Do something with x
        pass

fetch_db.py:

    # Fetch new data from the DB and dispatch it to workers.
    from time import sleep
    from tasks import process_data

    while True:
        # Run a DB query here to fetch new data into fetched_data
        process_data.delay(fetched_data)
        sleep(30)

Here is my concern: the data is being fetched every 30 …
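For the title question, capping the queue and pausing the feeder when it is full, a hedged sketch assuming the default RabbitMQ broker and the default 'celery' queue; MAX_QUEUE_SIZE and fetch_new_rows are illustrative names of my own, not Celery settings:

    from time import sleep

    from tasks import app, process_data

    MAX_QUEUE_SIZE = 100  # illustrative cap, tune to taste

    def fetch_new_rows():
        return []  # stand-in for the DB query in the question

    def queue_depth(queue_name='celery'):
        # A passive declare reports the queue's current message count
        # without creating or modifying the queue.
        with app.connection_or_acquire() as conn:
            ok = conn.default_channel.queue_declare(queue=queue_name,
                                                    passive=True)
            return ok.message_count

    while True:
        # Only feed the queue when it is below the cap.
        if queue_depth() < MAX_QUEUE_SIZE:
            for row in fetch_new_rows():
                process_data.delay(row)
        sleep(30)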

Celery parallel distributed task with multiprocessing

Submitted by ╄→гoц情女王★ on 2019-11-28 13:23:37
Question: I have a CPU-intensive Celery task. I would like to use all the processing power (cores) across lots of EC2 instances to get this job done faster (a Celery parallel distributed task with multiprocessing, I think). The terms threading, multiprocessing, distributed computing, and distributed parallel processing are all terms I'm trying to understand better. Example task:

    @app.task
    def process_ids():
        for item in list_of_millions_of_ids:
            id = item
            # do some long complicated equation here, very CPU heavy!!!
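A sketch of the usual answer under the question's assumptions: instead of one giant task, make each id (or chunk of ids) its own message, so every worker process on every instance can pull work off the shared queue. The task name and stand-in computation are mine:

    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def process_one(item_id):
        # the long, CPU-heavy equation from the question goes here
        return item_id * item_id  # stand-in computation

    list_of_millions_of_ids = range(1_000_000)  # stand-in id source

    # One message per 1000-id chunk; each worker process (one per core,
    # across all EC2 instances) pulls chunks off the shared queue.
    job = process_one.chunks(((i,) for i in list_of_millions_of_ids), 1000)
    result = job.apply_async()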

celery doesn't work with global variable

Submitted by 懵懂的女人 on 2019-11-28 11:14:56
    from celery import Celery

    app = Celery('tasks',
                 backend='amqp://guest@localhost//',
                 broker='amqp://guest@localhost//')

    a_num = 0

    @app.task
    def addone():
        global a_num
        a_num = a_num + 1
        return a_num

This is the code I used to test Celery. I expected the return value to increase every time I call addone(), but it is always 1. Why?

Results:

    $ python
    >>> from tasks import addone
    >>> r = addone.delay()
    >>> r.get()
    1
    >>> r = addone.delay()
    >>> r.get()
    1
    >>> r = addone.delay()
    >>> r.get()
    1

By default, when a worker is started, Celery starts it with a concurrency of 4, which means it has 4 processes started to …
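A minimal sketch of one common fix, assuming a local Redis server is available: keep the counter in shared storage instead of a per-process global, since each of the worker's 4 prefork processes gets its own copy of a_num:

    import redis
    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')
    counter = redis.Redis(host='localhost', port=6379)

    @app.task
    def addone():
        # INCR is atomic in Redis, so concurrent worker processes
        # cannot race each other; every call sees the shared value.
        return counter.incr('a_num')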

【Python celery】

Submitted by 允我心安 on 2019-11-28 10:48:17
Original post: http://blog.gqylpy.com/gqy/380

Install: pip install celery

celery is a Python module for running asynchronous, scheduled, and periodic tasks. Its components:

- the user task app: produces tasks
- the broker and backend pipes: the broker stores tasks, the backend stores task results
- the worker: executes tasks

Simple example. The worker file (workers.py):

    import time
    from celery import Celery

    # Create a Celery instance; this is the user's application app.
    my_task = Celery(
        'tasks',
        broker='redis://127.0.0.1:6380',   # where tasks are stored, Redis here
        backend='redis://127.0.0.1:6380',  # where task results are stored
    )

    # Create a task for the application.
    @my_task.task
    def fn1(x, y):
        time.sleep(10)
        return x + y

    """
    Start commands:
    Linux:   celery worker -A workers -l INFO
    Windows: celery worker -A workers -l INFO -P …
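The excerpt cuts off at the start command; a small sketch of the missing user-side file that would enqueue fn1 and read its result back from the Redis backend (the file name user.py is my own):

    # user.py -- enqueue a task and read its result later
    from workers import fn1

    result = fn1.delay(10, 20)     # returns an AsyncResult immediately
    print(result.id)               # task id, keyed in the Redis backend
    print(result.get(timeout=15))  # blocks until the worker returns 30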

Celery + Eventlet + non blocking requests

Submitted by 这一生的挚爱 on 2019-11-28 10:07:48
I am using Python requests in Celery workers to make a large number (~10/sec) of API calls (GET, POST, PUT, DELETE). Each request takes around 5-10 s to complete. I tried running Celery workers in the eventlet pool with a concurrency of 1000. Since requests is blocking, each concurrent connection is waiting on one request. How do I make the requests asynchronous?

temoto: Use eventlet monkey patching to make any pure-Python library non-blocking.

Patch a single library:

    # import requests
    # instead do this:
    import eventlet
    requests = eventlet.import_patched('requests')

The packages erequests and …
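To see why the patched library stops blocking, a minimal standalone sketch using an eventlet GreenPool (the URLs are illustrative); inside Celery the same effect comes from starting the worker with -P eventlet:

    import eventlet

    # Patched requests yields to other green threads while waiting on I/O.
    requests = eventlet.import_patched('requests')

    urls = ['https://httpbin.org/delay/5'] * 100  # illustrative slow endpoints

    pool = eventlet.GreenPool(1000)
    # All 100 requests wait on the network concurrently, not serially.
    for response in pool.imap(requests.get, urls):
        print(response.status_code)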

What is the maximum value size you can store in redis?

Submitted by 做~自己de王妃 on 2019-11-28 09:37:23
Does anyone know the maximum value size you can store in Redis? I want to use Redis as a message queue with Celery to store some small documents that need to be processed by a worker on another server, and I want to make sure the documents aren't going to be too big. I found one page with a reference to 1 GB, but when I followed that page's link to the source of the answer, the link was no longer valid: http://news.ycombinator.com/item?id=1182005

Thanks, Ken

It's my understanding that Redis strings, keys and values alike, are limited to 512 MB each. That means that the maximum amount of data you …
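For the use case in the question, a hedged pre-flight check before enqueueing; MAX_DOC_BYTES is an illustrative cap of my own, well under Redis's limits, not a Redis or Celery setting:

    import json

    from celery import Celery

    app = Celery('tasks', broker='redis://localhost:6379/0')

    MAX_DOC_BYTES = 1 * 1024 * 1024  # illustrative 1 MB cap for "small" docs

    @app.task
    def process_document(payload):
        doc = json.loads(payload)
        # worker-side processing goes here

    def enqueue_document(doc):
        payload = json.dumps(doc)
        size = len(payload.encode('utf-8'))
        if size > MAX_DOC_BYTES:
            raise ValueError('document too large to enqueue: %d bytes' % size)
        process_document.delay(payload)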

Celery scheduled list returns None

Submitted by 社会主义新天地 on 2019-11-28 08:30:01
Question: I'm fairly new to Celery and I've been attempting to set up a simple script to schedule and unschedule tasks. However, I feel like I'm running into a weird issue. I have the following setup:

    from celery import Celery

    app = Celery('celery_test', broker='amqp://', backend='amqp')

    @app.task
    def add(x, y):
        return x + y

I start up my Celery server just fine and can add tasks. Now when I want to get a list of active tasks, things seem to get weird. When I go to use inspect to get a list of scheduled tasks …
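A minimal sketch of the inspect calls involved; each of them returns None when no worker answers within the timeout (worker not running, wrong broker URL), which is the usual reason scheduled() comes back as None:

    from celery_test import app

    i = app.control.inspect()

    print(i.scheduled())  # tasks with an ETA/countdown, held back by workers
    print(i.active())     # tasks currently executing
    print(i.reserved())   # tasks prefetched by workers but not yet started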

Celery with RabbitMQ: AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

Submitted by 妖精的绣舞 on 2019-11-28 07:55:09
I'm running the First Steps with Celery tutorial. We define the following task:

    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def add(x, y):
        return x + y

Then call it:

    >>> from tasks import add
    >>> add.delay(4, 4)

But I get the following error:

    AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

I'm running both the Celery worker and the RabbitMQ server. Rather strangely, the Celery worker reports the task as succeeding:

    [2014-04-22 19:12:03,608: INFO/MainProcess] Task test_celery.add[168c7d96-e41a-41c9-80f5-50b24dcaff73] …
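The error appears when the result of add.delay() is read (e.g. .get() or .status) while no result backend is configured; a minimal sketch of the usual fix, keeping RabbitMQ as the broker and adding the rpc:// backend so results come back over the broker:

    from celery import Celery

    app = Celery('tasks',
                 backend='rpc://',  # enable result storage via the broker
                 broker='amqp://guest@localhost//')

    @app.task
    def add(x, y):
        return x + y

After restarting the worker, add.delay(4, 4).get() returns 8 instead of raising the AttributeError.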

Add n tasks to celery queue and wait for the results

Submitted by こ雲淡風輕ζ on 2019-11-28 07:26:09
I would like to add multiple tasks to the Celery queue and wait for the results. I have various ideas of how I could achieve this utilising some form of shared storage (memcached, redis, a db, etc.); however, I would have thought this is something Celery can handle automatically, but I can't find any resources online. Code example:

    def do_tasks(b):
        for a in b:
            c.delay(a)
        return c.all_results_some_how()

laffuste: For Celery >= 3.0, TaskSet is deprecated in favour of group.

    from celery import group
    from tasks import add

    job = group([
        add.s(2, 2),
        add.s(4, 4),
        add.s(8, 8),
        add.s(16, 16),
        add.s(32, 32),
    ])

Start the …
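The excerpt cuts off at starting the group; a sketch of the remaining step, assuming a result backend is configured so the results can be collected:

    # Dispatch all five tasks in parallel, then block until all finish.
    result = job.apply_async()
    print(result.get())  # -> [4, 8, 16, 32, 64], in the group's order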