django-celery

Create celery tasks then run synchronously

Submitted by 限于喜欢 on 2019-12-03 00:13:37
My app gathers a bunch of phone numbers on a page. Once the user hits the submit button, I create a celery task to call each number and play a reminder message, then redirect them to a page where they can see live updates about the calls. I am using web sockets to live-update the status of each call, and I need the tasks to execute synchronously, as I can only dial out from one number. So once the first call/task is completed, I want the next one to fire off. I took a look at the CELERY_ALWAYS_EAGER setting, but it just went through the first iteration and stopped. @task def reminder

Celery: Worker with concurrency and reserved tasks only running 1 task

Submitted by 扶醉桌前 on 2019-12-02 22:57:47
Some of the tasks in my code were taking longer and longer to execute. Upon inspection I noticed that although my worker node is set to concurrency 6, and 6 processes exist to 'do work', only 1 task is shown under 'running tasks'. Here is a little visual proof: Here are the worker options: And here is the task tab for that worker, with only 1 running process: I have found that if I restart celery, the concurrency is once again respected and I will see more than 1 running task, but after some amount of time/tasks it reverts back to this behavior. Any ideas for fixing this intermittent problem? I
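This symptom is often caused by prefetching: one process reserves a backlog of tasks that the idle processes cannot steal. A hedged sketch of the settings commonly suggested for long-running tasks, using the old-style celery 3.x names that match the django-celery era of this post:

```python
# settings.py
# Hand each worker process one task at a time instead of letting it
# reserve a batch (the default prefetch multiplier is 4).
CELERYD_PREFETCH_MULTIPLIER = 1

# Acknowledge tasks only after they finish, so one slow task does not
# sit on a pile of reserved-but-unstarted work.
CELERY_ACKS_LATE = True
```

On newer Celery versions the equivalents are `worker_prefetch_multiplier` and `task_acks_late`, and the worker can also be started with `-Ofair` scheduling.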

Recover from task failed beyond max_retries

Submitted by 别说谁变了你拦得住时间么 on 2019-12-02 19:08:49
I am attempting to asynchronously consume a web service because it takes up to 45 seconds to return. Unfortunately, this web service is also somewhat unreliable and can throw errors. I have set up django-celery and have my tasks executing, which works fine until a task fails beyond max_retries . Here is what I have so far: @task(default_retry_delay=5, max_retries=10) def request(xml): try: server = Client('https://www.whatever.net/RealTimeService.asmx?wsdl') xml = server.service.RunRealTimeXML( username=settings.WS_USERNAME, password=settings.WS_PASSWORD, xml=xml ) except Exception as e:

How to programmatically generate celerybeat entries with celery and Django

Submitted by ≯℡__Kan透↙ on 2019-12-02 18:31:15
I am hoping to be able to programmatically generate celerybeat entries and resync celerybeat when entries are added. The docs here state: "By default the entries are taken from the CELERYBEAT_SCHEDULE setting, but custom stores can also be used, like storing the entries in an SQL database." So I am trying to figure out which classes I need to extend to be able to do this. I have been looking at the celery scheduler docs and the djcelery API docs, but the documentation on what some of these methods do is non-existent, so I am about to dive into the source and was just hoping someone could point me in the right
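With django-celery specifically, database-backed beat entries usually do not require extending any scheduler class: point beat at djcelery's DatabaseScheduler and create schedule rows through the ORM. A hedged sketch; the model and field names are from djcelery, but the task path and labels are illustrative:

```python
# settings.py: tell celerybeat to read its entries from the database
# CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

from djcelery.models import IntervalSchedule, PeriodicTask

# Create (or reuse) an "every 10 seconds" schedule row...
schedule, _ = IntervalSchedule.objects.get_or_create(every=10, period='seconds')

# ...and attach a task to it. DatabaseScheduler detects changed rows,
# so newly added entries take effect without restarting beat.
PeriodicTask.objects.get_or_create(
    name='poll-backend',              # any unique label
    task='myapp.tasks.poll_backend',  # dotted path to a registered task
    interval=schedule,
)
```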

Django & Celery — Routing problems

Submitted by 非 Y 不嫁゛ on 2019-12-02 17:35:12
I'm using Django and Celery and I'm trying to set up routing to multiple queues. When I specify a task's routing_key and exchange (either in the task decorator or using apply_async() ), the task isn't added to the broker (which is Kombu connecting to my MySQL database). If I specify the queue name in the task decorator (which means the routing key is ignored), the task works fine. It appears to be a problem with the routing/exchange setup. Any idea what the problem could be? Here's the setup: settings.py INSTALLED_APPS = ( ... 'kombu.transport.django', 'djcelery', ) BROKER_BACKEND = 'django
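For comparison, a minimal old-style (celery 3.x) routing setup where queue, exchange, and routing key are all declared up front; a frequent cause of the symptom described is publishing with a routing_key/exchange pair that no declared queue is bound to, so the message is silently dropped. Queue and task names below are illustrative:

```python
# settings.py
from kombu import Exchange, Queue

CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
    Queue('feeds',   Exchange('feeds'),   routing_key='feeds.#'),
)

# Route by task name; the queue's own binding supplies the exchange and
# routing key, so a task can never reference an unbound exchange.
CELERY_ROUTES = {
    'myapp.tasks.refresh_feed': {'queue': 'feeds'},
}
```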

Examples of Django and Celery: Periodic Tasks

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-02 17:34:07
I have been fighting with the Django/Celery documentation for a while now and need some help. I would like to be able to run periodic tasks using django-celery. I have seen several different formats and schemas around the internet (and in the documentation) for how one should go about achieving this with Celery... Can someone help with a basic, functioning example of the creation, registration, and execution of a django-celery periodic task? In particular, I want to know whether I should write a task that extends the PeriodicTask class and register it, or whether I should use the @periodic_task

How can I run a celery periodic task from the shell manually?

Submitted by 荒凉一梦 on 2019-12-02 16:28:28
I'm using celery and django-celery. I have defined a periodic task that I'd like to test. Is it possible to run the periodic task from the shell manually so that I can view the console output? Have you tried just running the task from the Django shell? You can use the .apply method of a task to ensure that it is run eagerly and locally. Assuming the task is called my_task in the Django app myapp , in a tasks submodule:

$ python manage.py shell
>>> from myapp.tasks import my_task
>>> eager_result = my_task.apply()

The result instance has the same API as the usual AsyncResult type, except that the result

Celery worker ImportError: No module named 'project'

Submitted by 喜欢而已 on 2019-12-02 06:37:40
When I tried to start the worker I got an issue: ImportError: No module named 'project'

Traceback (most recent call last):
  File "/usr/local/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/celery/__main__.py", line 16, in main
    _main()
  File "/usr/local/lib/python3.5/dist-packages/celery/bin/celery.py", line 322, in main
    cmd.execute_from_commandline(argv)
  File "/usr/local/lib/python3.5/dist-packages/celery/bin/celery.py", line 496, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/usr/local/lib/python3.5
