celery

Make Django test case database visible to Celery

时光怂恿深爱的人放手 posted on 2019-11-27 03:37:34
Question: When a Django test case runs, it creates an isolated test database so that database writes are rolled back when each test completes. I am trying to create an integration test with Celery, but I can't figure out how to connect Celery to this ephemeral test database. In the naive setup, objects saved in Django are invisible to Celery, and objects saved in Celery persist indefinitely. Here is an example test case:

    import json
    from rest_framework.test import APITestCase
    from myapp.models import
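One common workaround (an assumption about the fix, since the question is cut off above) is to run tasks eagerly during tests, so they execute synchronously in the test process and therefore see the test database. A minimal sketch, assuming the standard Celery 4.x Django integration that reads Django settings under the CELERY namespace:

    # test settings: run tasks in-process instead of through a worker
    CELERY_TASK_ALWAYS_EAGER = True       # execute tasks synchronously on .delay()
    CELERY_TASK_EAGER_PROPAGATES = True   # re-raise task exceptions in the test

Because no separate worker process is involved, the task shares the test's database connection and transaction. The trade-off is that the broker is never exercised, so this tests the task code rather than the messaging layer.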

What is the maximum value size you can store in Redis?

拜拜、爱过 posted on 2019-11-27 03:02:35
Question: Does anyone know the maximum value size you can store in Redis? I want to use Redis as a message queue with Celery to store some small documents that need to be processed by a worker on another server, and I want to make sure the documents aren't going to be too big. I found one page with a reference to 1 GB, but when I followed that page's link to the source of the answer, the link was no longer valid. Here is the link: http://news.ycombinator.com/item?id=1182005 Thanks, Ken Answer 1:
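For reference, the current Redis documentation caps a single string value at 512 MB. A small guard like the sketch below (process_document is a hypothetical Celery task) can reject oversized documents before they are enqueued:

    import json

    from tasks import process_document  # hypothetical Celery task

    MAX_REDIS_STRING = 512 * 1024 * 1024  # documented Redis string limit

    def safe_enqueue(doc):
        payload = json.dumps(doc)
        if len(payload.encode('utf-8')) > MAX_REDIS_STRING:
            raise ValueError('document too large for a single Redis value')
        process_document.delay(payload)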

Celery: is there a way to write custom JSON Encoder/Decoder?

匆匆过客 posted on 2019-11-27 02:06:07
Question: I have some objects I want to send to Celery tasks in my application. Those objects are obviously not JSON serializable using the default json library. Is there a way to make Celery serialize/deserialize those objects with a custom JSON encoder/decoder? Answer 1: A bit late here, but you should be able to define a custom encoder and decoder by registering them in the kombu serializer registry, as in the docs: http://docs.celeryproject.org/en/latest/userguide/calling.html#serializers. For example
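A minimal sketch of that registration, assuming a hypothetical MyObject type that can round-trip itself through a dict:

    import json

    from kombu.serialization import register
    from myapp.types import MyObject  # hypothetical custom type

    class CustomEncoder(json.JSONEncoder):
        def default(self, obj):
            if isinstance(obj, MyObject):
                return {'__myobject__': obj.to_dict()}
            return super().default(obj)

    def custom_decoder(d):
        if '__myobject__' in d:
            return MyObject.from_dict(d['__myobject__'])
        return d

    def dumps(obj):
        return json.dumps(obj, cls=CustomEncoder)

    def loads(s):
        return json.loads(s, object_hook=custom_decoder)

    register('myjson', dumps, loads,
             content_type='application/x-myjson',
             content_encoding='utf-8')

Celery is then pointed at the new serializer by setting task_serializer = 'myjson' and adding 'myjson' to accept_content.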

How to run Celery on Windows?

我的未来我决定 posted on 2019-11-27 01:51:21
Question: How do I run a Celery worker on Windows without creating a Windows service? Is there any analogue of $ celery -A your_application worker ? Answer 1: Celery 4.0+ no longer officially supports Windows, but it still works on Windows for development/test purposes. Use the eventlet pool instead, as below:

    pip install eventlet
    celery -A <module> worker -l info -P eventlet

This works for me on Windows 10 + Celery 4.1 + Python 3. This solution resolved the following exception: [2017-11-16 21:19:46,938: ERROR
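Another option often suggested for Windows (an alternative, not part of the truncated answer above) is the single-threaded solo pool, which runs tasks in the worker's own process and so avoids the prefork pool that broke on Windows:

    celery -A <module> worker -l info --pool=solo

The cost is that the worker processes one task at a time, which is usually acceptable for development and testing.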

Add n tasks to the Celery queue and wait for the results

社会主义新天地 posted on 2019-11-27 01:32:51
Question: I would like to add multiple tasks to the Celery queue and wait for the results. I have various ideas for how I could achieve this using some form of shared storage (memcached, redis, a db, etc.); however, I would have thought this is something Celery can handle automatically, but I can't find any resources online. Code example:

    def do_tasks(b):
        for a in b:
            c.delay(a)
        return c.all_results_some_how()

Answer 1: For Celery >= 3.0, TaskSet is deprecated in favour of group.

    from celery import group
    from tasks import
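A minimal sketch of the group-based approach, reusing the asker's hypothetical task c:

    from celery import group

    from tasks import c  # the asker's hypothetical task

    def do_tasks(b):
        job = group(c.s(a) for a in b)  # one signature per input
        result = job.apply_async()      # enqueue everything at once
        return result.get(timeout=60)   # block until all tasks finish

result.get() on the GroupResult returns the individual results in submission order. Avoid calling it from inside another task, since that blocks a worker slot while waiting on other workers.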

Setting a time limit on a specific task with Celery

对着背影说爱祢 posted on 2019-11-27 01:00:40
Question: I have a task in Celery that could potentially run for 10,000 seconds while operating normally. However, all the rest of my tasks should finish in less than one second. How can I set a time limit for the intentionally long-running task without changing the time limit on the short-running tasks? Answer 1: You can set task time limits (hard and/or soft) either while defining a task or while calling it.

    from celery.exceptions import SoftTimeLimitExceeded

    @celery.task(time_limit=20)
    def mytask():
        try:
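A fuller sketch of both styles, with hypothetical do_long_work/cleanup helpers; the soft limit raises SoftTimeLimitExceeded inside the task so it can clean up before the hard limit kills the worker process:

    from celery.exceptions import SoftTimeLimitExceeded

    @celery.task(time_limit=10500, soft_time_limit=10000)
    def long_task():
        try:
            do_long_work()   # hypothetical 10,000-second operation
        except SoftTimeLimitExceeded:
            cleanup()        # hypothetical cleanup hook

    # Or override per call, leaving other tasks' defaults untouched:
    long_task.apply_async(time_limit=10500, soft_time_limit=10000)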

Celery - Get task id for current task

南楼画角 posted on 2019-11-27 00:49:31
Question: How can I get the task_id value for a task from within the task? Here's my code:

    from celery.decorators import task
    from django.core.cache import cache

    @task
    def do_job(path):
        "Performs an operation on a file"
        # ... Code to perform the operation ...
        cache.set(current_task_id, operation_results)

The idea is that when I create a new instance of the task, I retrieve the task_id from the task object. I then use the task id to determine whether the task has completed. I don't want to keep track of
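A sketch of the usual fix: bind the task so it receives itself as the first argument and can read its own request id (perform_operation stands in for the elided file work):

    from celery import shared_task
    from django.core.cache import cache

    from myapp.files import perform_operation  # hypothetical helper

    @shared_task(bind=True)
    def do_job(self, path):
        "Performs an operation on a file"
        results = perform_operation(path)    # hypothetical work
        cache.set(self.request.id, results)  # the running task's own id

Outside the task, the same id is available on the AsyncResult returned by do_job.delay(path), so the caller can poll completion without tracking ids separately.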

Retry Lost or Failed Tasks (Celery, Django and RabbitMQ)

纵然是瞬间 posted on 2019-11-27 00:47:08
Question: Is there a way to determine whether a task was lost and retry it? I think the reason for a loss could be a dispatcher bug or a worker thread crash. I was planning to retry them, but I'm not sure how to determine which tasks need to be retried. And how can I make this process automatic? Can I use my own custom scheduler to create new tasks? Edit: I found in the documentation that RabbitMQ never loses tasks, but what happens when a worker thread crashes in the middle of task execution? Answer 1: What
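The usual mechanism here (an assumption about the cut-off answer, not its actual text) is late acknowledgement: with acks_late, the message is acknowledged to RabbitMQ only after the task finishes, so a worker that dies mid-execution leaves the message unacknowledged and the broker redelivers it:

    @app.task(acks_late=True)
    def process(item):
        # if the worker crashes before this returns, RabbitMQ
        # redelivers the message to another worker
        do_work(item)  # hypothetical, and should be idempotent

Because redelivery can re-run a partially completed task, acks_late is only safe for idempotent work.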

How to start a Celery worker from a script/module __main__?

安稳与你 posted on 2019-11-27 00:46:24
Question: I've defined a Celery app in a module, and now I want to start the worker from the same module in its __main__, i.e. by running the module with python -m instead of celery from the command line. I tried this:

    app = Celery('project', include=['project.tasks'])

    # do all kinds of project-specific configuration
    # that should occur whenever this module is imported

    if __name__ == '__main__':
        # log stuff about the configuration
        app.start(['worker', '-A', 'project.tasks'])

but now Celery thinks I'm
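A sketch of one alternative: app.worker_main() parses worker arguments against the app object itself, so no -A lookup is involved and Celery cannot mistake the module for something else:

    from celery import Celery

    app = Celery('project', include=['project.tasks'])

    if __name__ == '__main__':
        # roughly equivalent to `celery -A project worker --loglevel=info`,
        # but bound to this app instance directly
        app.worker_main(['worker', '--loglevel=info'])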

How to keep multiple independent Celery queues?

心不动则不痛 posted on 2019-11-27 00:22:05
Question: I'm trying to keep multiple Celery queues, with different tasks and workers, in the same Redis database. Really it's just a convenience issue of only wanting one Redis server rather than two on my machine. I followed the Celery tutorial docs verbatim, as that was the only way I could get it to work. Now, when I try to duplicate everything with slightly tweaked names/queues, it keeps erroring out. Note: I'm newish to Python and Celery, which is obviously part of the problem. I'm not sure which parts
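A sketch of one way to isolate two apps on a single Redis server (all names are illustrative): give each app its own default queue, then point each worker at only its own queue:

    # app_one.py
    from celery import Celery

    app = Celery('app_one', broker='redis://localhost:6379/0')
    app.conf.task_default_queue = 'app_one_queue'

    # app_two.py follows the same pattern with 'app_two_queue'
    # (and optionally a different database number, e.g. .../1)

Each worker then consumes a single queue:

    celery -A app_one worker -Q app_one_queue -l info
    celery -A app_two worker -Q app_two_queue -l info

Because the workers consume from disjoint queues, the two task sets never cross even though they share one Redis instance.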