celery

Celery: list all tasks, scheduled, active *and* finished

左心房为你撑大大i submitted on 2019-12-02 20:29:25
Update for the bounty: I'd like a solution that does not involve a monitoring thread, if possible.

I know I can view scheduled and active tasks using the Inspect class of my app's Control:

    i = myapp.control.inspect()
    currently_running = i.active()
    scheduled = i.scheduled()

But I could not find any function that shows already finished tasks. I know this information must be at least temporarily accessible, because I can look up a finished task by its task_id:

    >>> r = mytask.AsyncResult(task_id=' ... ')
    >>> r.state
    u'SUCCESS'

How can I get a complete list of scheduled, active and finished tasks?
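A hedged sketch building on the snippet above: inspect() can only report what the workers currently hold, so finished tasks have to come from the result backend. The django-celery-results backend is an assumption here, not something the question mentions.

    i = myapp.control.inspect()          # myapp is the Celery app from the question
    snapshot = {
        'active': i.active(),            # currently executing on each worker
        'scheduled': i.scheduled(),      # eta/countdown tasks the workers hold
        'reserved': i.reserved(),        # prefetched but not yet started
    }

    # Finished tasks, if results are stored with django-celery-results:
    # from django_celery_results.models import TaskResult
    # finished = TaskResult.objects.filter(status='SUCCESS')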

Celery raises ValueError: not enough values to unpack

五迷三道 submitted on 2019-12-02 19:43:28
Trying to run a simple example with Celery and receiving an exception. RabbitMQ is started in Docker; I also tried starting it locally. Celery runs on a local Windows host.

    from celery import Celery

    app = Celery('tasks', broker='amqp://192.168.99.100:32774')

    @app.task()
    def hello():
        print('hello')

    if __name__ == '__main__':
        hello.delay()

Excerpt of my error text:

    [2017-08-18 00:01:08,628: INFO/MainProcess] Received task: tasks.hello[8d33dbea-c5d9-4938-ab1d-0646eb1a3858]
    [2017-08-18 00:01:08,632: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)' ...
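A hedged workaround sketch, not taken from the question: on Windows the Celery 4 prefork pool is commonly reported to raise exactly this ValueError. Two frequently cited fixes are switching the worker pool or setting the billiard flag shown below before the worker starts.

    import os

    # Option 1: make billiard take its Windows-compatible code path
    # (set before the worker process is launched).
    os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')

    # Option 2: avoid the prefork pool entirely by starting the worker with
    #   celery -A tasks worker --pool=solo --loglevel=info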

How to properly configure djcelery results backend to database

随声附和 submitted on 2019-12-02 19:37:30
I'm trying to set up django-celery to store task results in the database. I set:

    CELERY_RESULT_BACKEND = 'djcelery.backends.database.DatabaseBackend'

then I synced and migrated the db (no errors). Celery is working and tasks get processed (I can get the results), but the admin shows no tasks. There are two tables in the database, celery_taskmeta and djcelery_taskmeta. The first one holds the results and the second one is what the admin displays. Does anyone have insight into how to configure this properly?

Answer: Check the docs: when you use djcelery, set CELERY_RESULT_BACKEND = "database", or don't even bother to write this ...
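A minimal settings sketch of the setup the answer describes; the exact INSTALLED_APPS layout is an assumption about the project.

    import djcelery
    djcelery.setup_loader()

    INSTALLED_APPS = (
        # ...
        'djcelery',
    )

    # With djcelery loaded, the short form below selects its Django ORM backend,
    # so results land in djcelery_taskmeta -- the table the admin actually shows.
    CELERY_RESULT_BACKEND = 'database'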

celery.beat implementation of crontab

旧时模样 submitted on 2019-12-02 19:14:37
Question: My task is to add a new stream in Wowza media server, which must take place at a user-specified time. Currently I'm using crontab together with the HTTP provider for this purpose. I want a celery.beat implementation for this. Can anyone help?

Answer 1: If this is a one-off task to be executed at a specific time, then you don't need Periodic Tasks (celerybeat). Instead you can use the eta/countdown argument to task.apply_async:

    task.apply_async(eta=datetime(2012, 7, 1, 14, 30))
    task.apply_async ...
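A hedged sketch of both options from the answer; add_stream and its argument are placeholders for the asker's Wowza task, not real names from the question.

    from datetime import datetime
    from celery.schedules import crontab

    # One-off execution at a user-specified time:
    add_stream.apply_async(args=('rtmp://example/live',), eta=datetime(2012, 7, 1, 14, 30))
    # ...or relative to now:
    add_stream.apply_async(args=('rtmp://example/live',), countdown=45 * 60)

    # Recurring execution with celery beat, if a real schedule is needed:
    CELERYBEAT_SCHEDULE = {
        'nightly-stream': {
            'task': 'tasks.add_stream',
            'schedule': crontab(hour=2, minute=30),
            'args': ('rtmp://example/live',),
        },
    }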

Starting flask server in background

流过昼夜 submitted on 2019-12-02 19:10:33
I have a Flask application which I currently start in the following way:

    # phantom.py
    __author__ = 'uruddarraju'
    from phantom.api.v1 import app
    app.run(host='0.0.0.0', port=8080, debug=True)

When I run this script it executes successfully, printing:

    loading config from /home/uruddarraju/virtualenvs/PHANTOMNEW/Phantom/etc/phantom/phantom.ini
    * Running on http://0.0.0.0:8080/

But it never returns, and if I press CTRL-C the server stops. I am trying to deploy this to production and want the startup to run in the background, with the process staying up as long as the server is up.
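One hedged option among several (process managers such as supervisor or systemd are the other common routes): serve the app with gunicorn instead of app.run() and let gunicorn daemonize itself. The import path phantom.api.v1 comes from the question; everything else below, including the worker count, is an assumption.

    # Requires: pip install gunicorn
    import gunicorn.app.base
    from phantom.api.v1 import app


    class StandaloneApplication(gunicorn.app.base.BaseApplication):
        """Embed gunicorn so the existing phantom.py entry point keeps working."""

        def __init__(self, wsgi_app, options=None):
            self.options = options or {}
            self.wsgi_app = wsgi_app
            super(StandaloneApplication, self).__init__()

        def load_config(self):
            for key, value in self.options.items():
                if key in self.cfg.settings and value is not None:
                    self.cfg.set(key.lower(), value)

        def load(self):
            return self.wsgi_app


    if __name__ == '__main__':
        StandaloneApplication(app, {
            'bind': '0.0.0.0:8080',
            'workers': 2,
            'daemon': True,   # detach into the background
        }).run()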

Celery - How to send task from remote machine?

旧巷老猫 submitted on 2019-12-02 19:09:43
We have a server running Celery workers and a Redis queue. The tasks are defined on that server. I need to be able to call these tasks from a remote machine. I know that it is done using send_task, but I still haven't figured out HOW. How do I tell send_task where the queue is? Where do I pass connection parameters (or whatever is needed)? I've been looking for hours and all I can find is this:

    from celery.execute import send_task
    send_task('tasks.add')

Well, that means I need celery on my calling machine as well. But what else do I need to set up?

Answer: This may be a way: creating a Celery object and ...
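A minimal sketch of that idea, assuming the broker is the Redis instance mentioned above and reachable at the placeholder URL below. The calling machine only needs the celery package and the broker address; tasks are addressed by name, so their code never has to exist locally.

    from celery import Celery

    remote = Celery('caller',
                    broker='redis://broker-host:6379/0',
                    backend='redis://broker-host:6379/0')

    result = remote.send_task('tasks.add', args=(2, 3))
    print(result.get(timeout=10))   # needs the result backend configured above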

Recover from task failed beyond max_retries

别说谁变了你拦得住时间么 submitted on 2019-12-02 19:08:49
I am attempting to consume a web service asynchronously because it takes up to 45 seconds to return. Unfortunately, this web service is also somewhat unreliable and can throw errors. I have set up django-celery and have my tasks executing, which works fine until the task fails beyond max_retries. Here is what I have so far:

    @task(default_retry_delay=5, max_retries=10)
    def request(xml):
        try:
            server = Client('https://www.whatever.net/RealTimeService.asmx?wsdl')
            xml = server.service.RunRealTimeXML(
                username=settings.WS_USERNAME,
                password=settings.WS_PASSWORD,
                xml=xml
            )
        except Exception, e:
            ...
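A hedged sketch of one way to recover once retries run out (call_web_service and handle_permanent_failure are hypothetical stand-ins, and the broker URL is a placeholder): retry() raises MaxRetriesExceededError after max_retries, and catching it gives a place for fallback logic.

    from celery import Celery
    from celery.exceptions import MaxRetriesExceededError

    app = Celery('tasks', broker='amqp://')   # placeholder broker URL


    def call_web_service(xml):                # stand-in for the flaky SOAP call
        raise RuntimeError('service unavailable')


    def handle_permanent_failure(xml):        # stand-in fallback, e.g. flag the record
        print('giving up on', xml)


    @app.task(bind=True, default_retry_delay=5, max_retries=10)
    def request_xml(self, xml):
        try:
            return call_web_service(xml)
        except Exception as exc:
            try:
                raise self.retry(exc=exc)
            except MaxRetriesExceededError:
                return handle_permanent_failure(xml)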

How to disallow pickle serialization in celery

☆樱花仙子☆ submitted on 2019-12-02 18:45:55
Celery defaults to using pickle as its serialization method for tasks. As noted in the FAQ, this represents a security hole. Celery allows you to configure how tasks get serialized using the CELERY_TASK_SERIALIZER configuration parameter. But this doesn't solve the security problem: even if tasks are serialized with JSON or similar, the workers will still execute tasks inserted into the queue with pickle serialization -- they just respond to the content-type parameter in the message. So anybody who can write to the task queue can effectively pwn the worker processes by writing malicious ...
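A minimal settings sketch (using the uppercase names from the question; the accept-content setting was added in Celery 3.1): CELERY_ACCEPT_CONTENT is what makes workers refuse messages whose content-type is not on the list, which closes the hole described above.

    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_ACCEPT_CONTENT = ['json']    # workers reject pickle (and everything else)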

How to programmatically generate celerybeat entries with celery and Django

≯℡__Kan透↙ submitted on 2019-12-02 18:31:15
I am hoping to programmatically generate celerybeat entries and resync celerybeat when entries are added. The docs here state: "By default the entries are taken from the CELERYBEAT_SCHEDULE setting, but custom stores can also be used, like storing the entries in an SQL database." So I am trying to figure out which classes I need to extend to be able to do this. I have been looking at the celery scheduler docs and the djcelery API docs, but the documentation on what some of these methods do is non-existent, so I'm about to dive into some source and was just hoping someone could point me in the right ...
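A hedged sketch of one route that avoids subclassing anything: djcelery's database scheduler reads beat entries from Django models, so entries can be created at runtime and beat picks up the changes. The scheduler path and model names below are djcelery's; the task path is a placeholder.

    # settings.py
    CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

    # anywhere in application code:
    from djcelery.models import CrontabSchedule, PeriodicTask

    schedule, _ = CrontabSchedule.objects.get_or_create(minute='30', hour='2')
    PeriodicTask.objects.get_or_create(
        name='nightly-report',
        task='myapp.tasks.build_report',   # placeholder task path
        crontab=schedule,
    )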

How to run Celery with Django on OpenShift 3

只谈情不闲聊 submitted on 2019-12-02 18:24:20
Question: What is the easiest way to launch a celery beat and worker process in my Django pod? I'm migrating my OpenShift v2 Django app to OpenShift v3, on the Pro subscription. I'm really a noob on OpenShift v3, Docker, containers and Kubernetes. I have used this tutorial https://blog.openshift.com/migrating-django-applications-openshift-3/ to migrate my app (which works pretty well). I'm now struggling with how to start Celery. On OpenShift 2 I just used an action hook post_start:

    source ...
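A hedged sketch, not OpenShift guidance: one pragmatic option is a small wrapper that starts beat and a worker next to the web process; on OpenShift 3 the cleaner pattern is usually a second DeploymentConfig that runs the same image with a script like this as its command. 'myproject' is a placeholder for the Django/Celery app module.

    # run_celery.py -- co-locates worker and beat in one container (assumption)
    import subprocess

    processes = [
        subprocess.Popen(['celery', '-A', 'myproject', 'worker', '--loglevel=info']),
        subprocess.Popen(['celery', '-A', 'myproject', 'beat', '--loglevel=info']),
    ]

    for proc in processes:
        proc.wait()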