celery

Can Celery tasks survive restart?

血红的双手。 Submitted on 2019-12-19 05:44:41
Question: I need to build a system that handles two types of tasks. One type can create more tasks of itself or of the other type. There will be very few workers (2-3) and only one host. The most important requirement is that the system handle restarts gracefully: i.e. on restart, tasks that were in progress should start from scratch, and the workers should pick up tasks that were queued prior to the restart. Looking at Celery, it appears to be suitable for this use case. However, I have a couple of
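The restart behavior the question asks for maps onto Celery's acknowledgement settings. A minimal configuration sketch (setting names as in Celery 4.x; the broker URL is a placeholder):

```python
# Sketch: Celery configuration for restart-safe tasks (Celery 4.x setting names).
# With late acknowledgement, a message is only acked after the task finishes,
# so a task killed mid-run is redelivered and restarted from scratch when the
# worker comes back; queued-but-unstarted messages simply stay on the broker.
from celery import Celery

app = Celery("myapp", broker="amqp://localhost")  # hypothetical broker URL

app.conf.update(
    task_acks_late=True,              # ack after completion, not on receipt
    task_reject_on_worker_lost=True,  # requeue if the worker process dies
    worker_prefetch_multiplier=1,     # don't hold a backlog of unacked messages
)
```

With these settings, tasks must be idempotent-safe to re-run, since a task interrupted mid-execution will be delivered again.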

Fixing the error "SQLite 3.8.3 or later is required (found 3.7.17)"

泄露秘密 Submitted on 2019-12-18 23:38:54
Right after deploying a Django project on the server, starting it with python manage.py runserver 0.0.0.0:8000 raised: django.core.exceptions.ImproperlyConfigured: SQLite 3.8.3 or later is required (found 3.7.17). The error output and the fix follow. Contents: 1. The error 2. The fix

1. The error

(venv) [root@localhost celery]# python manage.py runserver 0.0.0.0:8000
Watching for file changes with StatReloader
Exception in thread django-main-thread:
Traceback (most recent call last):
  File "/usr/local/python3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/local/python3/lib/python3.6/threading.py", line 864, in run
    self._target(*
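The version Django checks is the SQLite C library the Python interpreter is linked against, not any sqlite3 binary on the PATH. Before and after upgrading, you can verify it from the interpreter itself:

```python
import sqlite3

# The version of the SQLite C library this interpreter is compiled/linked
# against -- this is what Django's check inspects.
linked_version = sqlite3.sqlite_version
print(linked_version)

# Compare numerically (as a tuple of ints), not as strings:
# "3.10.0" < "3.8.3" lexicographically but not numerically.
meets_requirement = sqlite3.sqlite_version_info >= (3, 8, 3)
```

If `meets_requirement` is False after installing a newer SQLite, the interpreter is still linked against the old library (e.g. an `LD_LIBRARY_PATH` or rebuild issue), which is why upgrading the system package alone sometimes does not clear the error.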

Celery Group task for use in a map/reduce workflow

北慕城南 Submitted on 2019-12-18 19:03:02
Question: Can I use a Celery Group primitive as the umbrella task in a map/reduce workflow? Or, more specifically: can the subtasks in a Group be run on multiple workers on multiple servers? From the docs: "However, if you call apply_async on the group it will send a special grouping task, so that the action of calling the tasks happens in a worker instead of the current process." That seems to imply the tasks are all sent to one worker... Before 3.0 (and still) one could fire off the subtasks in a TaskSet
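Note that each signature in a group becomes its own message on the broker, so any worker on any host can consume any subtask; the "special grouping task" quoted above only refers to where the *dispatching* happens, not where the subtasks run. A map/reduce sketch using hypothetical task names (requires a running broker and result backend to actually execute):

```python
# Sketch: map/reduce with Celery canvas primitives (hypothetical tasks/URLs).
# The chord's header is a group of independent messages, distributed across
# all available workers; the body runs once every header result is in.
from celery import Celery, chord

app = Celery("mapreduce", broker="amqp://localhost", backend="rpc://")  # placeholders

@app.task
def map_item(i):
    return i * i

@app.task
def reduce_items(results):
    return sum(results)

workflow = chord((map_item.s(i) for i in range(10)), reduce_items.s())
# result = workflow.apply_async()   # needs a broker + result backend
# result.get()                      # -> the reduced value
```

A chord rather than a bare group is the usual choice when a reduce step must wait for all map results.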

Retrieving GroupResult from taskset_id in Celery?

ぃ、小莉子 Submitted on 2019-12-18 15:47:50
Question: I am starting a set of Celery tasks by using a Celery group as described in the official documentation. I am also storing the group (taskset) id in a DB, in order to poll Celery for the taskset state.

job = group([
    single_test.s(1, 1),
    single_test.s(1, 2),
    single_test.s(1, 3),
])
result = job.apply_async()
test_set = MyTestSet()
test_set.taskset_id = result.id
# store test_set into DB

Is there a way to obtain a GroupResult object (i.e. my result) starting from the taskset id? Something like
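Celery's result API has save/restore methods for exactly this round trip. A sketch continuing the question's code (assumes a configured result backend; without calling save() first, restore() has nothing to look up):

```python
# Sketch: persisting a GroupResult and restoring it later by id.
# Assumes `job` is the group from the question and a result backend is configured.
from celery.result import GroupResult

result = job.apply_async()
result.save()              # persist the group's metadata in the result backend
saved_id = result.id       # what the question stores in the DB as taskset_id

# Later, e.g. in a polling view:
restored = GroupResult.restore(saved_id)
if restored is not None:
    print(restored.completed_count(), restored.ready())
```

restore() returns None when the id is unknown to the backend, so the None check doubles as an expiry/typo guard.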

Register Celery Class-based Task

江枫思渺然 Submitted on 2019-12-18 13:07:16
Question: Python 3.x, Celery 4.x... I have a class-based task.

myproj/celery.py:

from celery import Celery
# django settings stuff...
app = Celery('myproj')
app.autodiscover_tasks()

app1/tasks.py:

import celery

class EmailTask(celery.Task):
    def run(self, *args, **kwargs):
        self.do_something()

If I do:

$ celery worker -A myproj -l info
[tasks]
  . app2.tasks.debug_task
  . app2.tasks.test

So the Celery decorators work to register tasks, but the class-based task is not registered. How do I get the class-based
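In Celery 4.x, class-based tasks are no longer auto-registered the way they were in older versions; an instance has to be registered on the app explicitly. A sketch against the question's layout (module paths and the name attribute are assumptions):

```python
# app1/tasks.py -- sketch: explicit registration of a class-based task in Celery 4.x.
import celery

from myproj.celery import app  # the Celery app from the question


class EmailTask(celery.Task):
    name = "app1.tasks.EmailTask"   # explicit name, since no decorator derives one

    def run(self, *args, **kwargs):
        self.do_something()


# register_task returns the bound, registered task instance;
# after this the worker lists app1.tasks.EmailTask under [tasks].
email_task = app.register_task(EmailTask())
# email_task.delay(...) now dispatches like a decorated task
```

The alternative is to keep using the `@app.task` decorator with a `base=EmailTask` argument, which registers a task while reusing the class's behavior.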

How can I use PyCharm to locally debug a Celery worker? [duplicate]

99封情书 Submitted on 2019-12-18 12:48:01
Question: This question already has answers here: How do I enable remote celery debugging in PyCharm? (6 answers). Closed 3 years ago. I have an existing Django project with a virtualenv. After activating the venv, I can run Celery with just the command celery. This works on Windows, OS X, and Linux. I wanted to try PyCharm on Windows, and I'm able to get it to run my Django server (using the project's venv), but I also want to run Celery so I can debug that as well. I can't find a simple,
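One commonly used workaround is a tiny launcher script that PyCharm can run as an ordinary Python run configuration with the project's venv interpreter, so breakpoints in task code work as usual. A sketch (the filename, app name, and pool choice are assumptions for illustration):

```python
# runcelery.py -- a small launcher a PyCharm run/debug configuration can point at.
# Running the worker in-process with --pool=solo avoids forked child processes,
# which keeps the debugger attached to the code that actually executes tasks.
import sys

from celery.__main__ import main

if __name__ == "__main__":
    sys.argv = ["celery", "worker", "-A", "myproj", "--pool=solo", "-l", "info"]
    sys.exit(main())
```

In PyCharm: Run → Edit Configurations → add a Python configuration with this script and the project interpreter, then debug it like any other script.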

Using mock to patch a celery task in Django unit tests

情到浓时终转凉″ Submitted on 2019-12-18 12:47:31
Question: I'm trying to use the Python mock library to patch a Celery task that is run when a model is saved in my Django app, to see that it's being called correctly. Basically, the task is defined inside myapp.tasks and is imported at the top of my models.py file like so:

from .tasks import mytask

...and then runs on save() inside the model using mytask.delay(foo, bar). So far so good - it works fine when I'm actually running celeryd etc. I want to construct a unit test that mocks the task, just
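The usual pitfall here is patching `myapp.tasks.mytask` instead of `myapp.models.mytask`: because models.py does `from .tasks import mytask`, the name must be patched in the namespace where it is *used*. A self-contained sketch of the pattern (this module stands in for myapp.models; `save_model` stands in for Model.save()):

```python
# Sketch: patch the task where it is looked up, not where it is defined.
from unittest import mock


class _BrokerTask:
    """Stand-in for a Celery task object bound by `from .tasks import mytask`."""

    def delay(self, *args, **kwargs):
        raise RuntimeError("would publish to the broker")


mytask = _BrokerTask()  # plays the role of myapp.models.mytask


def save_model(foo, bar):
    # what the model's save() effectively does
    mytask.delay(foo, bar)


# Patch the *using* module's attribute (here, this module itself); for the
# question's layout the target string would be "myapp.models.mytask".
with mock.patch(f"{__name__}.mytask") as fake_task:
    save_model("foo", "bar")  # no broker involved
    fake_task.delay.assert_called_once_with("foo", "bar")

recorded = fake_task.delay.call_args
```

Patching "myapp.tasks.mytask" instead would leave the already-imported reference in models.py untouched, and the real task would still run.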

Disable Django Debugging for Celery

青春壹個敷衍的年華 Submitted on 2019-12-18 12:27:37
Question: Is it possible to set DEBUG=False for only a specific app in Django? Celery has a known memory leak when debugging is enabled. I have a development server where I want Celery to run as a service, without debugging so it doesn't leak memory, but I want the rest of my Django app to use debugging so errors will be shown when testing.

Answer 1: Celery doesn't have a memory leak; it's how Django works: when DEBUG is enabled, Django appends every executed SQL statement to django.db.connection.queries,
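Since DEBUG is a process-wide setting rather than a per-app one, one practical approach is to branch on an environment variable set only where the worker is launched. A settings.py sketch (the CELERY_WORKER variable name is an assumption; any marker set in the worker's environment works):

```python
# settings.py sketch: web processes keep DEBUG=True, the Celery worker runs
# with DEBUG=False so django.db.connection.queries never accumulates.
import os


def debug_enabled(environ=os.environ):
    """DEBUG is off when the process was launched as a Celery worker
    (hypothetical CELERY_WORKER=1 set in the worker's service environment)."""
    return environ.get("CELERY_WORKER") != "1"


DEBUG = debug_enabled()
```

The worker's service definition (systemd unit, supervisor config, etc.) then exports CELERY_WORKER=1, while runserver is started without it.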

Celery: clean way of revoking the entire chain from within a task

◇◆丶佛笑我妖孽 Submitted on 2019-12-18 11:56:39
Question: My question is probably pretty basic, but I still can't find a solution in the official docs. I have defined a Celery chain inside my Django application, performing a set of tasks dependent on each other:

chain(
    tasks.apply_fetching_decision.s(x, y),
    tasks.retrieve_public_info.s(z, x, y),
    tasks.public_adapter.s(),
)()

Obviously the second and third tasks need the output of the parent; that's why I used a chain. Now the question: I need to programmatically revoke the 2nd and 3rd tasks if
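One clean way to stop the remainder of a chain from inside a task is to clear the chain on the task's own request, so the not-yet-sent successors are simply never dispatched. A sketch in the spirit of the question's tasks (`should_abort` and `do_fetch` are hypothetical placeholders):

```python
# Sketch: aborting the rest of a chain from within a task. bind=True gives the
# task access to self.request; setting request.chain to None drops the queued
# successors (they were never sent yet, so no revoke() call is needed).
from celery import Celery

app = Celery("myapp", broker="amqp://localhost")  # hypothetical broker URL


@app.task(bind=True)
def apply_fetching_decision(self, x, y):
    if should_abort(x, y):            # hypothetical predicate
        self.request.chain = None     # remaining chain tasks are never dispatched
        return None
    return do_fetch(x, y)             # hypothetical work
```

This avoids the race inherent in revoke()-ing by id, since a chain's later tasks are only published when their parent finishes.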

Python SSL connection “EOF occurred in violation of protocol”

做~自己de王妃 Submitted on 2019-12-18 11:47:11
Question: I'm using a Django Celery task to connect to the Facebook Graph API with the requests lib, using gevent. The issue I keep running into is that every now and then I get an "EOF occurred in violation of protocol" exception. I've searched around and various sources offer different fixes, but none seems to work. I've tried monkey patching the ssl module (gevent.monkey.patch_all()) and some others too, but no luck. I'm not even sure this is an OpenSSL issue, as some sources suggest, since I haven't encountered it
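One frequent cause of intermittent SSL errors under gevent is patch ordering: if ssl or requests is imported before patch_all() runs, those modules keep references to the unpatched socket implementation. A minimal ordering sketch (the Graph API URL is illustrative only):

```python
# Sketch: gevent monkey patching must be the very first thing the process does,
# before ssl/requests (or anything importing them) is loaded; otherwise mixed
# patched/unpatched sockets can surface as sporadic
# "EOF occurred in violation of protocol" errors.
from gevent import monkey

monkey.patch_all()   # must run before any other imports below

import requests      # imported only after patching

resp = requests.get("https://graph.facebook.com/me")  # illustrative call
```

In a Django/Celery setup this means the patch has to happen at worker startup (e.g. at the top of the entry module), not inside the task module, which is typically imported far too late.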