celery

Difference between celery get and join

孤街醉人 submitted on 2019-12-01 01:31:05
Question: Is there any difference between: r = group(some_task.s(i) for i in range(10)).apply_async() result = r.join() And: r = group(some_task.s(i) for i in range(10))() result = r.get() The Celery documentation uses both examples and I do not see any difference. Answer 1: Short answer: while the get and join methods for a group should return the same results, get implements some caching and will probably be more efficient depending on the backend you're using. Unless you really need to use join for some edge case,
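For reference, a minimal side-by-side sketch of the two forms from the question (it assumes a configured Celery app with a result backend and the some_task task from the post):

```python
from celery import group

# Form 1: build the group, then dispatch it explicitly.
r = group(some_task.s(i) for i in range(10)).apply_async()
result = r.join()   # collects results in order, raising on the first error

# Form 2: calling the group is shorthand for apply_async().
r = group(some_task.s(i) for i in range(10))()
result = r.get()    # same results; get() adds caching and backend-aware fetching
```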

Run multiple Celery tasks using a topic exchange

北城余情 submitted on 2019-12-01 01:30:28
I'm replacing some homegrown code with Celery, but having a hard time replicating the current behaviour. My desired behaviour is as follows: when creating a new user, a message should be published to the tasks exchange with the user.created routing key. Two Celery tasks should be triggered by this message, namely send_user_activate_email and check_spam. I tried implementing this by defining a user_created task with an ignore_result=True argument, plus a task each for send_user_activate_email and check_spam. In my configuration, I added the following routes and queues definitions. While the message
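The answer is not included in the excerpt, but a common Celery idiom for the described behaviour, instead of relying on topic routing alone, is to make user_created a thin fan-out task. A sketch with hypothetical module-level tasks (not the poster's actual code):

```python
from celery import group, shared_task

@shared_task(ignore_result=True)
def user_created(user_id):
    # Fan out to the two handlers; each can still be routed to its own
    # queue via task_routes so dedicated workers pick them up.
    group(
        send_user_activate_email.s(user_id),
        check_spam.s(user_id),
    ).apply_async()

@shared_task
def send_user_activate_email(user_id):
    ...

@shared_task
def check_spam(user_id):
    ...
```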

Python - Retry a failed Celery task from another queue

会有一股神秘感。 submitted on 2019-12-01 00:53:27
I'm posting data to a web service from Celery. Sometimes the data is not posted because the internet connection is down, and the task is retried endlessly until it gets through. Retrying is unnecessary while the network is down, so there is no point in retrying again and again. I thought of a better solution: if a task fails three times (retrying a minimum of 3 times), it is shifted to another queue. This queue contains the list of all failed tasks. Now when the internet is up and the data is posted over the net, i.e. the task has been completed from the normal queue, it then
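The rest of the question is cut off, but the "retry three times, then park the work in a failed queue" idea can be sketched roughly like this (the URL, task names, and delays are placeholders, not the poster's code):

```python
import requests
from celery import shared_task
from celery.exceptions import MaxRetriesExceededError

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def post_data(self, payload):
    try:
        requests.post('https://example.com/endpoint', json=payload, timeout=10)
    except requests.RequestException:
        try:
            raise self.retry()                 # retry up to 3 times
        except MaxRetriesExceededError:
            # After the third failure, park the work in a queue of failed
            # tasks that can be replayed once the network is back.
            post_data_failed.apply_async((payload,), queue='failed')

@shared_task
def post_data_failed(payload):
    requests.post('https://example.com/endpoint', json=payload, timeout=10)
```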

Celery: Callback after task hierarchy

风流意气都作罢 submitted on 2019-12-01 00:03:14
Question: I'm using Celery from a webapp to start a task hierarchy. Tasks: I'm using the following tasks: task_a, task_b, task_c, notify_user. A Django view starts several task_a instances. Each of them does some processing and then starts several task_b instances, and each of those does some processing and then starts several task_c instances. Goals: my goal is to execute all the tasks, and to run a callback function as soon as the entire hierarchy has finished. Additionally, I want to be able
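The hierarchy-specific answer is not included in the excerpt, but the basic Celery primitive for "run a callback once a whole group has finished" is a chord; a minimal sketch of one level (task bodies are placeholders):

```python
from celery import chord, shared_task

@shared_task
def task_c(x):
    return x * 2

@shared_task
def notify_user(results, user_id):
    # Runs exactly once, after every task_c in the header has finished;
    # `results` is the list of their return values.
    print('hierarchy level finished for user', user_id, results)

def run_level(user_id):
    header = (task_c.s(i) for i in range(10))
    return chord(header)(notify_user.s(user_id))
```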

celery

你离开我真会死。 submitted on 2019-11-30 23:58:29
Celery is a tool for performance optimization. Django is single-process; Celery can start a new process alongside the Django framework to handle time-consuming work, so the user gets a fast response.
The four concepts in Celery:
Task: simply a Python function.
Queue: tasks that need to be executed are added to a queue.
Worker: runs in a new process and executes the tasks in the queue.
Broker: handles scheduling; Redis is used in the deployment environment. Think of a workshop foreman who hands tasks out to the workers.
Official Celery documentation (Chinese): http://docs.jinkan.org/docs/celery/
When to use Celery: 1. the work is time-consuming; 2. the work has nothing to do with the response result.
Packages to install: celery==3.1.25 celery-with-redis==3.0 django-celery==3.1.17
Source: https://www.cnblogs.com/andy9468/p/11645189.html
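A minimal sketch of the setup described above (the module name, broker URL, and task body are assumptions, not from the original post):

```python
# tasks.py
from celery import Celery

# Broker and result backend on a local Redis instance.
app = Celery('demo',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

@app.task
def send_report(user_id):
    # The slow work happens here, in the worker process,
    # so the web view can return to the user immediately.
    ...

# Enqueue from a Django view (or anywhere):   send_report.delay(42)
# Start a worker in a separate process:       celery -A tasks worker -l info
```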

Getting task_id inside a Celery task

有些话、适合烂在心里 submitted on 2019-11-30 23:41:34
Question: This is probably a stupid question, but it's got me stumped coming from a Ruby background. I have an object that looks like this when I try to print it: print celery.AsyncResult.task_id >>> <property object at 0x10c383838> I was expecting the actual value of the task_id property to be printed here. How do I get to the actual value? UPDATE 1: @celery.task def scan(host): print celery.AsyncResult.task_id cmd = 'ps -ef' cm = shlex.split(cmd) scan = subprocess.check_output(cm) return scan Best
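celery.AsyncResult.task_id here is the property object on the class, not a value; inside a running task the id usually comes from the task's own request, e.g. with the bound-task form. A sketch, not the poster's exact code:

```python
import shlex
import subprocess

from celery import Celery

app = Celery('demo', broker='redis://localhost:6379/0')

@app.task(bind=True)
def scan(self, host):
    # self.request.id is the id of the currently executing task instance.
    print(self.request.id)
    output = subprocess.check_output(shlex.split('ps -ef'))
    return output
```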

Celery periodic_task running multiple times in parallel

主宰稳场 submitted on 2019-11-30 23:36:26
I have some very simple periodic code using Celery's threading; it simply prints "Pre" and "Post" and sleeps in between. It is adapted from this StackOverflow question and this linked website: from celery.task import task from celery.task import periodic_task from django.core.cache import cache from time import sleep import main import cutout_score from threading import Lock import socket from datetime import timedelta from celery.decorators import task, periodic_task def single_instance_task(timeout): def task_exc(func): def wrapper(*args, **kwargs): lock_id = "celery-single-instance-" + func._
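The code above is cut off, but the pattern being built is a cache-based lock so that only one instance of the periodic task runs at a time; a rough sketch of that idea (names and timeouts are illustrative):

```python
from functools import wraps
from django.core.cache import cache

def single_instance_task(timeout):
    def task_exc(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            lock_id = 'celery-single-instance-' + func.__name__
            # cache.add only succeeds if the key is absent, so a second
            # worker invoking the task while the lock is held does nothing.
            if cache.add(lock_id, 'true', timeout):
                try:
                    return func(*args, **kwargs)
                finally:
                    cache.delete(lock_id)
        return wrapper
    return task_exc
```

Note that cache.add is only a reliable lock when the cache backend is shared across processes and atomic (e.g. memcached or Redis); a per-process backend such as Django's local-memory cache will not prevent parallel runs.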

Jobs not executing via Airflow that runs celery with RabbitMQ

筅森魡賤 submitted on 2019-11-30 23:26:21
Below is the config I'm using:

[core]
# The home folder for airflow, default is ~/airflow
airflow_home = /root/airflow
# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository
dags_folder = /root/airflow/dags
# The folder where airflow should store its log files. This location
base_log_folder = /root/airflow/logs
# An S3 location can be provided for log backups
# For S3, use the full URL to the base folder (starting with "s3://...")
s3_log_folder = None
# The executor class that airflow should use. Choices include
# SequentialExecutor, LocalExecutor,

Start celery worker throws “no attribute 'worker_state_db'”

柔情痞子 submitted on 2019-11-30 22:17:50
Question: When I try to start the Celery worker in my Django app with: celery -A myApp worker -l info I get the following error: File "/home/alexander/.pyenv/versions/3.5.1/envs/myApp/lib/python3.5/site-packages/celery/utils/collections.py", line 134, in __getattr__ type(self).__name__, k)) AttributeError: 'Settings' object has no attribute 'worker_state_db' If you know how to solve it, please share your idea! Answer 1: The bug appears if an exception is raised while parsing settings, such as when we set Django's
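The answer is cut off, but one way to surface the real configuration error (instead of the misleading worker_state_db AttributeError) is to load the Django settings and the Celery app directly in a shell; a sketch assuming the usual myApp project layout:

```python
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myApp.settings')

import django
django.setup()                    # raises the underlying settings error, if any

from myApp.celery import app
print(app.conf.worker_state_db)   # reachable once the settings parse cleanly
```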

Celery - Schedule periodic task at the end of another task

别来无恙 submitted on 2019-11-30 21:40:15
I want to schedule a periodic task with Celery dynamically, at the end of another group of tasks. I know how to create (static) periodic tasks with Celery: CELERYBEAT_SCHEDULE = { 'poll_actions': { 'task': 'tasks.poll_actions', 'schedule': timedelta(seconds=5) } } But I want to create periodic jobs dynamically from my tasks (and maybe have a way to stop those periodic jobs when some condition is achieved, i.e. all tasks done). Something like: @celery.task def run(ids): group(prepare.s(id) for id in ids) | execute.s(ids) | poll.s(ids, schedule=timedelta(seconds=5)) @celery.task def prepare(id): ...
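One way to get this behaviour without touching the beat schedule at runtime is a poll task that keeps re-enqueuing itself until its stop condition is met, chained after the group via a chord; a sketch (the stop condition and task bodies are placeholders, not the original code):

```python
from celery import chord, shared_task

@shared_task
def prepare(id):
    ...

@shared_task
def execute(results, ids):
    ...

def all_done(ids):
    # Placeholder: replace with the real "all actions finished" check.
    return False

@shared_task(bind=True, max_retries=None)
def poll(self, ids):
    if all_done(ids):
        return 'done'
    # Re-enqueue this same task 5 seconds from now; it stops itself
    # once the condition above is met.
    raise self.retry(countdown=5)

@shared_task
def run(ids):
    # prepare() for every id, then execute(), then start polling.
    chord((prepare.s(i) for i in ids),
          execute.s(ids) | poll.si(ids)).apply_async()
```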