celery-task

How to call a Celery task's delay function from non-Python languages such as Java?

大城市里の小女人 submitted on 2019-12-03 14:31:58
I have set up Celery + RabbitMQ on a 3-machine cluster. I have also created a task which generates a regular expression based on data from a file and uses that information to parse text.

    from celery import Celery
    import re

    celery = Celery('tasks', broker='amqp://localhost//')

    @celery.task
    def add(x, y):
        return x + y

    def get_regular_expression():
        with open("text") as fp:
            data = fp.readlines()
        str_re = "|".join([x.split()[2] for x in data])
        return str_re

    @celery.task
    def analyse_json(tw):
        str_re = get_regular_expression()
        re.match(str_re, tw.text)

I can make the call to this task very easily
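For reference, a Celery task can also be invoked by its registered name with send_task, which carries the same information (task name plus serializable arguments) that a non-Python AMQP client such as a Java producer would have to put into the message it publishes. A minimal sketch, reusing the broker URL from the code above; the registered name 'tasks.add' is assumed from the module layout shown:

    from celery import Celery

    client = Celery('tasks', broker='amqp://localhost//')

    # send_task does not need the task function imported locally; it only needs
    # the task's registered name and serializable arguments.
    result = client.send_task('tasks.add', args=[2, 3])
    print(result.id)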

Import error in celery

烂漫一生 submitted on 2019-12-03 09:58:56
This is the code which I am running:

    from __future__ import absolute_import
    from celery import Celery

    celery1 = Celery('celery', broker='amqp://', backend='amqp://', include=['tasks'])
    celery1.conf.update(
        CELERY_TASK_RESULT_EXPIRES=3600,
    )

    if __name__ == '__main__':
        celery1.start()

When I execute the above code it gives me the following error: ImportError: cannot import name Celery

Jaredp37: I ran into this same error as well, and renaming the file fixed it. For anyone else encountering this, the reason why you get this issue is that your local celery.py is getting imported instead of the actual celery
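A quick way to confirm the shadowing described in that answer (a small sketch, assuming the snippet above is saved as celery.py in the working directory):

    # If this prints the path of your own celery.py rather than the installed
    # package, that local file is shadowing the real celery library and should
    # be renamed (e.g. to celery_app.py).
    import celery
    print(celery.__file__)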

How to make a celery task fail from within the task?

不打扰是莪最后的温柔 submitted on 2019-12-03 04:20:58
Under some conditions, I want to make a Celery task fail from within that task. I tried the following:

    from celery.task import task
    from celery import states

    @task()
    def run_simulation():
        if some_condition:
            run_simulation.update_state(state=states.FAILURE)
            return False

However, the task still reports to have succeeded:

    Task sim.tasks.run_simulation[9235e3a7-c6d2-4219-bbc7-acf65c816e65] succeeded in 1.17847704887s: False

It seems that the state can only be modified while the task is running; once it is completed, Celery changes the state to whatever it deems is the outcome (refer to this
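One commonly used pattern for this (a sketch based on the snippet above, not necessarily the original poster's final solution) is to set the FAILURE state and then raise celery.exceptions.Ignore so the worker does not overwrite the state with SUCCESS when the function returns; alternatively, simply raising an exception marks the task as failed:

    from celery.task import task
    from celery import states
    from celery.exceptions import Ignore

    @task()
    def run_simulation():
        if some_condition:
            # Record the failure, then tell the worker to leave the state alone.
            run_simulation.update_state(state=states.FAILURE)
            raise Ignore()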

How can I run a celery periodic task from the shell manually?

人走茶凉 submitted on 2019-12-03 02:54:23
Question: I'm using celery and django-celery. I have defined a periodic task that I'd like to test. Is it possible to run the periodic task from the shell manually, so that I can view the console output?

Answer 1: Have you tried just running the task from the Django shell? You can use the .apply method of a task to ensure that it is run eagerly and locally. Assuming the task is called my_task in the Django app myapp, in a tasks submodule:

    $ python manage.py shell
    >>> from myapp.tasks import my_task
    >>> eager_result = my_task.apply()
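For quick manual testing there is also an even more direct option (a small sketch, independent of the broker): calling the task object itself executes its body synchronously in the current process, so any console output appears in the same shell:

    >>> from myapp.tasks import my_task
    >>> my_task()   # runs the task body directly; no worker or broker involved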

Celery: list all tasks, scheduled, active *and* finished

左心房为你撑大大i submitted on 2019-12-02 20:29:25
Update for the bounty: I'd like a solution that does not involve a monitoring thread, if possible.

I know I can view scheduled and active tasks using the Inspect class of my app's Control:

    i = myapp.control.inspect()
    currently_running = i.active()
    scheduled = i.scheduled()

But I could not find any function to show already finished tasks. I know that this information must be at least temporarily accessible, because I can look up a finished task by its task_id:

    >>> r = my_task.AsyncResult(task_id=' ... ')
    >>> r.state
    u'SUCCESS'

How can I get a complete list of scheduled, active and finished tasks
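Inspect only reports what the workers currently hold, so as a hedged sketch (assuming myapp is the Celery app from the question), this is roughly what it can and cannot show:

    i = myapp.control.inspect()
    print(i.registered())  # task names each worker knows about
    print(i.active())      # tasks currently executing
    print(i.scheduled())   # eta/countdown tasks waiting for their time
    print(i.reserved())    # tasks prefetched by workers but not yet started
    # Finished tasks are not retained by the workers themselves; they can only
    # be queried per task_id via the result backend (as shown above) or by
    # collecting the worker event stream, e.g. with `celery events` or Flower.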

How can I run a celery periodic task from the shell manually?

荒凉一梦 submitted on 2019-12-02 16:28:28
I'm using celery and django-celery. I have defined a periodic task that I'd like to test. Is it possible to run the periodic task from the shell manually, so that I can view the console output?

Have you tried just running the task from the Django shell? You can use the .apply method of a task to ensure that it is run eagerly and locally. Assuming the task is called my_task in the Django app myapp, in a tasks submodule:

    $ python manage.py shell
    >>> from myapp.tasks import my_task
    >>> eager_result = my_task.apply()

The result instance has the same API as the usual AsyncResult type, except that the result
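The eager result can then be inspected in the same shell; a small sketch continuing the session above:

    >>> eager_result.successful()   # True if the task body did not raise
    >>> eager_result.get()          # the task's return value, available immediately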

How to start a task only when all other tasks have finished in Celery

℡╲_俬逩灬. submitted on 2019-12-02 09:56:42
In Celery, I want to start a task only when all the other tasks have completed. I found some resources, such as "Celery Starting a Task when Other Tasks have Completed" and "Running a task after all tasks have been completed", but I am quite new to Celery and could not really understand them (or many other resources, for that matter). So I have defined a task like this in tasks.py:

    @celapp.task()
    def sampleFun(arg1, arg2, arg3):
        # do something here

and I call it like this:

    for x in xrange(4):
        tasks.sampleFun.delay(val1, val2, val3)

And I assume that there would be 4 different tasks created
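Celery's standard primitive for "run something only after a group of tasks finishes" is a chord. A minimal sketch reusing sampleFun from above; final_task is a hypothetical callback name, and chords need a result backend to be configured:

    from celery import chord

    @celapp.task()
    def final_task(results):
        # `results` is the list of return values from the sampleFun calls
        print("all done", results)

    header = [tasks.sampleFun.s(val1, val2, val3) for x in range(4)]
    chord(header)(final_task.s())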

Celery worker ImportError: No module named 'project'

喜欢而已 submitted on 2019-12-02 06:37:40
When I tried to start the worker I got an issue: ImportError: No module named 'project'

    Traceback (most recent call last):
      File "/usr/local/bin/celery", line 11, in <module>
        sys.exit(main())
      File "/usr/local/lib/python3.5/dist-packages/celery/__main__.py", line 16, in main
        _main()
      File "/usr/local/lib/python3.5/dist-packages/celery/bin/celery.py", line 322, in main
        cmd.execute_from_commandline(argv)
      File "/usr/local/lib/python3.5/dist-packages/celery/bin/celery.py", line 496, in execute_from_commandline
        super(CeleryCommand, self).execute_from_commandline(argv)))
      File "/usr/local/lib/python3.5

Celery worker ImportError: No module named 'project'

夙愿已清 submitted on 2019-12-02 06:35:40
Question: When I tried to start the worker I got an issue: ImportError: No module named 'project'

    Traceback (most recent call last):
      File "/usr/local/bin/celery", line 11, in <module>
        sys.exit(main())
      File "/usr/local/lib/python3.5/dist-packages/celery/__main__.py", line 16, in main
        _main()
      File "/usr/local/lib/python3.5/dist-packages/celery/bin/celery.py", line 322, in main
        cmd.execute_from_commandline(argv)
      File "/usr/local/lib/python3.5/dist-packages/celery/bin/celery.py", line 496, in execute_from

Celery is rerunning long-running completed tasks over and over

狂风中的少年 submitted on 2019-12-01 07:18:25
Question: I have a Python celery-redis queue processing uploads and downloads worth gigs and gigs of data at a time. A few of the uploads take up to a few hours. However, once such a task finishes, I'm witnessing this bizarre Celery behaviour: the Celery scheduler is rerunning the just-concluded task by sending it again to the worker (I'm running a single worker). And it just happened twice on the same task! Can someone help me understand why this is happening and how I can prevent it? The tasks are
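With the Redis broker this symptom often comes from the transport's visibility_timeout: a task that runs longer than that timeout (3600 seconds by default) can be redelivered to a worker even though the first run already happened. A hedged sketch of the usual mitigation, assuming `app` is the Celery application object and using the old-style setting name seen elsewhere on this page:

    # Raise the visibility timeout well above the longest expected task runtime
    # (here 12 hours) so long uploads are not redelivered by the Redis transport.
    app.conf.BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 43200}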