celery

Python - Retry a failed Celery task from another queue

Submitted by 人盡茶涼 on 2019-11-30 19:16:32
Question: I'm posting data to a web service from Celery. Sometimes the data is not posted because the internet is down, and the task is retried endlessly until it succeeds. This endless retrying is unnecessary while the network is down. I thought of a better solution: if a task fails three times (retrying a minimum of 3 times), it is shifted to another queue that contains a list of all failed tasks. Now when the internet is
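A minimal sketch of that idea, assuming a requests-based POST: retry at most three times, and once retries are exhausted, re-dispatch the payload to a separate queue of failed posts. The task names, URL and queue name below are hypothetical.

import requests
from celery import Celery
from celery.exceptions import MaxRetriesExceededError

app = Celery('posts', broker='redis://localhost:6379/0')

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def post_data(self, payload):
    try:
        requests.post('http://example.com/api', json=payload, timeout=10)
    except requests.RequestException as exc:
        try:
            # Retry up to max_retries (3), waiting 60 seconds between attempts.
            raise self.retry(exc=exc)
        except MaxRetriesExceededError:
            # Give up and park the payload on a dedicated queue of failed posts.
            store_failed_post.apply_async(args=[payload], queue='failed_posts')

@app.task
def store_failed_post(payload):
    # Placeholder: record the payload so it can be re-posted once the network is back.
    print('stored failed payload:', payload)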

Python celery introduction and basic usage

Submitted by 和自甴很熟 on 2019-11-30 19:14:41
08 Python celery introduction and basic usage. Celery is a distributed task queue. With RPC, a command is executed remotely and the client waits for the result to come back; on Linux the work can run in the background without blocking other tasks (i.e. asynchronously).
1. Distributed task processing with celery. Reference: https://www.cnblogs.com/alex3714/p/6351797.html Task scheduling: https://www.cnblogs.com/peida/archive/2013/01/08/2850483.html Crontab is the operating system's own scheduler; Celery can also run scheduled tasks without relying on the operating system. RabbitMQ can also be used for asynchronous work.
2. Test code. Celery has problems running on Windows, so use it on Linux.
[root@backup testcleery]# celery -A celery_test worker -l debug
You may need to adjust environment variables during startup:
export C_FORCE_ROOT=True
[root@backup testcleery]# export C_FORCE_ROOT=True
[root@backup testcleery]# celery -A celery_test worker -l info
Check the logs to confirm the task module loaded successfully; both Celery modules were loaded. Then test the celery module.
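For reference, a minimal sketch of what a celery_test.py module matching the "celery -A celery_test worker" command above might contain; the Redis broker and backend URLs are assumptions and should match your own broker setup.

from celery import Celery

app = Celery('celery_test',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def add(x, y):
    return x + y

A client can then submit work with result = add.delay(4, 6) and read it back with result.get().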

Python celery: Retrieve tasks arguments if there's an exception

Submitted by 送分小仙女□ on 2019-11-30 19:05:29
I am getting started with Celery and Python, and I have a question that is probably very simple, but I don't seem to be able to find any suitable answer around... If I have a bunch of tasks and one of them throws an exception, is there a way of retrieving the arguments that were passed to that task? For instance, if I want to get the IPs some hostnames resolve to, and I create a task... @tasks_app.task def resolve_hostname(hostname): return (hostname, {hst.address for hst in dns.resolver.query(hostname)}) ... which can throw an exception, is there a way of getting the value of that hostname
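One approach, sketched here as an assumption rather than the accepted answer: give the task a custom base class whose on_failure hook receives the failing invocation's args and kwargs.

import dns.resolver  # dnspython, as in the question
from celery import Celery, Task

tasks_app = Celery('dns_tasks', broker='redis://localhost:6379/0')

class ReportArgsOnFailure(Task):
    def on_failure(self, exc, task_id, args, kwargs, einfo):
        # args/kwargs are the arguments the failing call was made with.
        print('task %s failed with %r; args=%s kwargs=%s' % (task_id, exc, args, kwargs))

@tasks_app.task(base=ReportArgsOnFailure)
def resolve_hostname(hostname):
    return (hostname, {hst.address for hst in dns.resolver.query(hostname)})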

Celery periodic_task running multiple times in parallel

Submitted by ↘锁芯ラ on 2019-11-30 18:39:41
Question: I have some very simple periodic code using Celery's threading; it simply prints "Pre" and "Post" and sleeps in between. It is adapted from this StackOverflow question and this linked website. from celery.task import task from celery.task import periodic_task from django.core.cache import cache from time import sleep import main import cutout_score from threading import Lock import socket from datetime import timedelta from celery.decorators import task, periodic_task def single_instance_task
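A hedged sketch of what a single_instance_task guard like the one the question defines often looks like, using the old-style celery.task API that matches the imports above: Django's cache acts as a lock so overlapping runs of the periodic task are skipped. The lock timeout and key naming are assumptions.

from functools import wraps
from datetime import timedelta
from django.core.cache import cache
from celery.task import periodic_task

def single_instance_task(timeout):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            lock_id = 'celery-single-instance-' + func.__name__
            if cache.add(lock_id, 'true', timeout):  # cache.add is atomic: only one caller wins
                try:
                    return func(*args, **kwargs)
                finally:
                    cache.delete(lock_id)  # release the lock
        return wrapper
    return decorator

@periodic_task(run_every=timedelta(seconds=60))
@single_instance_task(timeout=10 * 60)
def print_pre_post():
    print('Pre')
    # ... slow work here ...
    print('Post')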

Jobs not executing via Airflow that runs celery with RabbitMQ

Submitted by 淺唱寂寞╮ on 2019-11-30 18:35:21
Question: Below is the config I'm using. [core] # The home folder for airflow, default is ~/airflow airflow_home = /root/airflow # The folder where your airflow pipelines live, most likely a # subfolder in a code repository dags_folder = /root/airflow/dags # The folder where airflow should store its log files. This location base_log_folder = /root/airflow/logs # An S3 location can be provided for log backups # For S3, use the full URL to the base folder (starting with "s3://...") s3_log_folder = None #
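For jobs to actually execute with Celery, the [core] section shown above is not the whole story. As a hedged sketch (exact key names vary between Airflow versions), the executor must be set to CeleryExecutor, the [celery] section must point broker_url at RabbitMQ, and the scheduler plus at least one airflow worker must be running alongside the webserver.

[core]
executor = CeleryExecutor

[celery]
broker_url = amqp://guest:guest@localhost:5672//
celery_result_backend = db+mysql://airflow:airflow@localhost/airflow

# In separate shells (Airflow 1.x CLI):
#   airflow scheduler
#   airflow worker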

Make Celery use Django's test database without task_always_eager

Submitted by 旧时模样 on 2019-11-30 18:17:00
When running tests in Django applications that make use of Celery tasks, I can't fully test tasks that need to get data from the database, since they don't connect to the test database that Django creates. Setting task_always_eager in Celery to True partially solves this problem, but as the testing documentation says, this doesn't fully reflect how the code will run on a real Celery worker and isn't suitable for testing. How can I make Celery tasks use the Django test database when running Django tests, without setting task_always_eager = True? Short answer: you must run the celery worker as in
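One pattern that matches that truncated answer, shown here as a sketch rather than the accepted solution: start an in-process worker from celery.contrib.testing inside the test case (Celery 4.x), so the worker shares the test process and therefore sees Django's test database. The import paths for the app and the task are hypothetical, and a result backend is assumed to be configured.

from celery.contrib.testing.worker import start_worker
from django.test import TransactionTestCase

from myproject.celery import app            # hypothetical: your project's Celery app
from myapp.tasks import count_users         # hypothetical task that queries the DB

class CeleryTaskTests(TransactionTestCase):
    @classmethod
    def setUpClass(cls):
        super().setUpClass()
        # Run a real worker inside this process; the ping check is skipped for brevity.
        cls.worker = start_worker(app, perform_ping_check=False)
        cls.worker.__enter__()

    @classmethod
    def tearDownClass(cls):
        cls.worker.__exit__(None, None, None)
        super().tearDownClass()

    def test_task_sees_test_database(self):
        result = count_users.delay()
        self.assertEqual(result.get(timeout=10), 0)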

How to put rate limit on a celery queue?

Submitted by China☆狼群 on 2019-11-30 17:23:16
I read this in the Celery documentation for Task.rate_limit (http://celery.readthedocs.org/en/latest/userguide/tasks.html#Task.rate_limit): "Note that this is a per worker instance rate limit, and not a global rate limit. To enforce a global rate limit (e.g. for an API with a maximum number of requests per second), you must restrict to a given queue." How do I put a rate limit on a Celery queue? Thanks for not downvoting the question. It turns out it can't be done at the queue level for multiple workers. It can be done at the queue level for one worker, or at the queue level for each worker. So if you say 10 jobs/minute on 5
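A hedged sketch of the single-worker workaround: route the rate-limited task to its own queue, set a per-worker rate_limit, and run exactly one worker on that queue so the per-worker limit effectively becomes global. The task and queue names are hypothetical.

from celery import Celery

app = Celery('api_calls', broker='redis://localhost:6379/0')
app.conf.task_routes = {'call_external_api': {'queue': 'api'}}

@app.task(name='call_external_api', rate_limit='10/m')  # at most 10 per minute per worker
def call_external_api(payload):
    ...  # perform the rate-limited API request here

# Start a single dedicated worker so the per-worker limit is also the global limit:
#   celery -A api_calls worker -Q api --concurrency=1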

Running background Celery task in Flask

Submitted by 我的梦境 on 2019-11-30 17:02:15
Problem has been updated to include progress made. I have the following code and my celery tasks kick off fine; I just don't know where I should store the async result so that I can look at it again later. #!/usr/bin/env python """Page views.""" from flask import render_template, request from flask import Flask from celerytest import add from time import sleep app = Flask(__name__) async_res = [] @app.route('/', methods=['GET', 'POST']) def run(): if request.method == 'GET': return render_template("template.html") else: form = request.form n1 = str(form.get("n1")) n2 = str(form.get("n2")) aysnc
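A hedged sketch of one way to keep hold of the result: store only the task id (in the session, a database, or the response) and rebuild an AsyncResult from it later. The /result route and its JSON shape are assumptions; add is the task from the question's celerytest module.

from flask import Flask, jsonify, render_template, request
from celerytest import add  # the Celery task from the question

app = Flask(__name__)

@app.route('/', methods=['GET', 'POST'])
def run():
    if request.method == 'GET':
        return render_template("template.html")
    form = request.form
    n1 = int(form.get("n1"))
    n2 = int(form.get("n2"))
    async_res = add.delay(n1, n2)
    # Persist only the id; the AsyncResult can be reconstructed from it later.
    return jsonify({'task_id': async_res.id})

@app.route('/result/<task_id>')
def result(task_id):
    res = add.AsyncResult(task_id)  # rebuild the handle from the stored id
    payload = {'state': res.state}
    if res.ready():
        payload['result'] = res.result
    return jsonify(payload)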

Celery workers unable to connect to redis on docker instances

Submitted by 旧时模样 on 2019-11-30 16:50:15
Question: I have a dockerized setup running a Django app within which I use Celery tasks. Celery uses Redis as the broker. Versioning: Docker version 17.09.0-ce, build afdb6d4; docker-compose version 1.15.0, build e12f3b9; Django==1.9.6; django-celery-beat==1.0.1; celery==4.1.0; celery[redis]; redis==2.10.5. Problem: My celery workers appear to be unable to connect to the redis container located at localhost:6379. I am able to telnet into the redis server on the specified port. I am able to verify redis
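A hedged sketch of the usual fix: inside a Docker network each container's localhost is the container itself, so the broker URL has to point at the Redis service name from docker-compose (assumed here to be "redis"), ideally injected through an environment variable.

import os
from celery import Celery

# "redis" is the docker-compose service name, not localhost.
BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://redis:6379/0')

app = Celery('myproject', broker=BROKER_URL, backend=BROKER_URL)

The worker container would then set CELERY_BROKER_URL=redis://redis:6379/0 in its environment and share a network with the redis service.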

Notes on a brief introduction to using Celery.

Submitted by 断了今生、忘了曾经 on 2019-11-30 16:21:15
Celery's components. Task module (Task): covers asynchronous tasks and scheduled tasks. Asynchronous tasks are usually triggered from business logic and sent to the task queue, while scheduled tasks are sent to the task queue periodically by the Celery Beat process. Message broker (Broker): the broker is the task dispatch queue; it receives messages (i.e. tasks) from task producers and stores them in the queue. Celery itself does not provide a queue service; RabbitMQ, Redis and the like are officially recommended. Task execution unit (Worker): the worker is the processing unit that executes tasks; it monitors the message queue in real time, fetches the tasks dispatched to the queue and executes them. Result store (Backend): the backend stores task execution results so they can be queried; as with the broker, RabbitMQ, Redis, MongoDB and others can be used for storage.
In my own usage, I use it in Django for uploading files to an OSS server. Since that operation does not affect the user's subsequent actions, and making the user wait would be a poor experience, Celery can do the work asynchronously. Celery is in fact an independent process, spawned specifically to accept tasks submitted by the task module.
(diagram from a previous project)
First, install the runtime environment:
pip install 'celery[redis]'  # this form installs the redis dependency together with celery
A local redis must be running before you start.
Here is the simplest possible Celery example: import time
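A minimal sketch of the kind of example the note introduces, assuming Redis as both broker and backend; the task simply sleeps to stand in for a slow operation such as the OSS upload mentioned above, and all names here are illustrative.

import time
from celery import Celery

app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def upload_to_oss(filename):
    # Stand-in for the real upload; sleeping simulates the slow work.
    time.sleep(5)
    return '%s uploaded' % filename

The calling code would enqueue it with upload_to_oss.delay('photo.jpg') and return to the user immediately.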