celery

Find out whether celery task exists

只愿长相守 submitted on 2019-11-28 19:07:04
Is it possible to find out whether a task with a certain task id exists? When I try to get the status, I always get PENDING:

    >>> AsyncResult('...').status
    'PENDING'

I want to know whether a given task id is a real Celery task id and not a random string, and I want different results depending on whether there is a valid task for a certain id. There may have been a valid task in the past with the same id, but its results may have been deleted from the backend.

asksol: Celery does not write a state when the task is sent; this is partly an optimization (see http://docs.celeryproject.org/en/latest…
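Because an unknown id and a not-yet-started task both report PENDING, one workaround (my assumption, not part of asksol's truncated answer) is to make every real task record a state as early as possible, e.g. with Celery's task_track_started setting, and then treat PENDING as "unknown id". A minimal sketch:

    # Sketch: treat PENDING as "no such task". Only safe if every task
    # records a state, e.g. task_track_started = True in the app config.
    from celery.result import AsyncResult

    def task_exists(task_id, app):
        """Heuristic: PENDING can mean 'unknown id' or 'not started yet'."""
        return AsyncResult(task_id, app=app).state != 'PENDING'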

day-90 celery

我的梦境 submitted on 2019-11-28 19:01:42
What is Celery? A distributed system for processing large volumes of messages: an asynchronous task queue focused on real-time processing that also supports task scheduling.

Celery architecture: Celery's architecture has three parts: a message broker, task execution units (workers), and a task result store.

- Message broker: Redis is used here
- Task execution units: provided by Celery
- Task result store: Redis is used here

Use cases:

- Asynchronous tasks: hand time-consuming work to Celery to run asynchronously, e.g. sending SMS/email, push notifications, audio/video processing
- Scheduled tasks: run something on a schedule, e.g. daily statistics

Installing Celery:

    pip install eventlet
    pip install celery

Basic usage:

1. Create a file celery_app_task.py:

    import celery
    import time

    broker = 'redis://127.0.0.1:6379/0'
    backend = 'redis://127.0.0.1:6379/1'
    app = celery.Celery('test', backend=backend, broker=broker)

    @app.task
    def add(x, y):
        time.sleep(1)
        return x + y

2. …
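The post truncates at step 2. A typical next step (my assumption, not recovered from the original) is to start a worker and enqueue the task from a client script; the eventlet install above suggests the author runs the worker with the eventlet pool (-P eventlet), a common requirement on Windows.

    # Start a worker in one terminal (illustrative command):
    #   celery -A celery_app_task worker -l info
    # Then enqueue the task from another process:
    from celery_app_task import add

    result = add.delay(4, 6)       # sends the task through the Redis broker
    print(result.id)               # task id, usable later with AsyncResult
    print(result.get(timeout=10))  # blocks until a worker returns 10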

Interoperating with Django/Celery From Java

笑着哭i submitted on 2019-11-28 18:54:14
Our company has a Python-based web site and some Python-based worker nodes which communicate via Django/Celery and RabbitMQ. I have a Java-based application which needs to submit tasks to the Celery-based workers.

I can send jobs to RabbitMQ from Java just fine, but the Celery-based workers never pick up the jobs. Looking at packet captures of both types of job submission, there are differences, but I cannot fathom how to account for them because a lot of it is binary that I cannot find documentation about decoding. Does anyone here have any reference or experience with having…
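The differences in the packet captures are most likely the message body format and AMQP properties: Celery workers only consume messages that follow Celery's task message protocol. As a reference for what the Java producer must emit, here is a sketch publishing a protocol-v1-style message from Python with a plain AMQP client (pika and the task name are illustrative assumptions):

    import json
    import uuid
    import pika  # assumption: any plain AMQP client works the same way

    body = json.dumps({
        "id": str(uuid.uuid4()),
        "task": "myapp.tasks.add",  # hypothetical registered task name
        "args": [2, 3],
        "kwargs": {},
        "retries": 0,
        "eta": None,
    })

    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = conn.channel()
    channel.basic_publish(
        exchange="celery",           # Celery's default exchange/queue
        routing_key="celery",
        body=body,
        properties=pika.BasicProperties(
            content_type="application/json",  # workers reject unknown types
            content_encoding="utf-8",
        ),
    )
    conn.close()

Reproducing the same JSON body and properties from the Java RabbitMQ client should make the workers pick the jobs up.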

How to write an Ubuntu Upstart job for Celery (django-celery) in a virtualenv

℡╲_俬逩灬. submitted on 2019-11-28 18:52:25
I really enjoy using Upstart. I currently have Upstart jobs to run different gunicorn instances in a number of virtualenvs. However, the 2-3 examples I found for Celery Upstart scripts on the interwebs don't work for me.

So, with the following variables, how would I write an Upstart job to run django-celery in a virtualenv? (A sketch follows this list.)

- Path to Django project: /srv/projects/django_project
- Path to this project's virtualenv: /srv/environments/django_project
- Path to Celery settings: the Django project settings file (django-celery): /srv/projects/django_project/settings.py
- Path to the log file for this…
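One possible job file under the stated paths; this is a hedged sketch, not a tested recipe, using django-celery's manage.py celeryd entry point and a hypothetical log path (the original list is cut off before the log file):

    # /etc/init/celeryd.conf -- illustrative sketch
    description "django-celery worker"
    start on runlevel [2345]
    stop on runlevel [016]
    respawn

    script
        cd /srv/projects/django_project
        exec /srv/environments/django_project/bin/python manage.py celeryd \
            --loglevel=INFO --logfile=/var/log/celeryd.log
    end script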

Setting Time Limit on specific task with celery

我的梦境 submitted on 2019-11-28 18:33:50
I have a task in Celery that could potentially run for 10,000 seconds while operating normally. However, all the rest of my tasks should finish in under one second. How can I set a time limit for the intentionally long-running task without changing the time limit on the short-running tasks?

mher: You can set task time limits (hard and/or soft) either while defining a task or while calling it.

    from celery.exceptions import SoftTimeLimitExceeded

    @celery.task(time_limit=20)
    def mytask():
        try:
            return do_work()
        except SoftTimeLimitExceeded:
            cleanup_in_a_hurry()

or

    mytask.apply_async(args=[], …
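The snippet cuts off at the per-call form. For illustration (the argument values are mine, and note that catching SoftTimeLimitExceeded requires a soft limit to actually be set), per-call limits look like:

    # Only this invocation gets the long limits; other tasks keep defaults.
    mytask.apply_async(args=[], kwargs={},
                       time_limit=10000, soft_time_limit=9900)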

Celery - Get task id for current task

我是研究僧i submitted on 2019-11-28 18:05:10
How can I get the task_id value for a task from within the task? Here's my code:

    from celery.decorators import task
    from django.core.cache import cache

    @task
    def do_job(path):
        "Performs an operation on a file"
        # ... code to perform the operation ...
        cache.set(current_task_id, operation_results)

The idea is that when I create a new instance of the task, I retrieve the task_id from the task object. I then use the task id to determine whether the task has completed. I don't want to keep track of the task by the path value because the file is "cleaned up" after the task completes, and may or may…
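One way to do this (the modern Celery API, offered as a sketch rather than as the answer the original thread gave) is to bind the task so it can read its own request context:

    from celery import shared_task
    from django.core.cache import cache

    @shared_task(bind=True)
    def do_job(self, path):
        "Performs an operation on a file"
        operation_results = ...  # code to perform the operation
        cache.set(self.request.id, operation_results)  # the task's own id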

How to create Celery Windows Service?

感情迁移 submitted on 2019-11-28 18:02:32
I'm trying to create a Windows service to launch Celery. I came across an article that does it using Task Scheduler; however, it seems to launch numerous Celery instances and keeps eating up memory until the machine dies. Is there any way to launch it as a Windows service?

Vite Falcon: I got the answer from another website. Celeryd (the daemon service for Celery) runs as a paster application; searching for 'Paster Windows Service' led me here. It describes how to run a Pylons application as a Windows service. Being new to the paster framework and hosting Python web services, it didn't cross my…
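For reference, a minimal sketch of wrapping a single worker process in a Windows service with pywin32 (all names are illustrative; this is not the paster-based approach the answer links to):

    import subprocess
    import win32event
    import win32service
    import win32serviceutil

    class CeleryService(win32serviceutil.ServiceFramework):
        _svc_name_ = "CeleryWorker"       # hypothetical service name
        _svc_display_name_ = "Celery Worker"

        def __init__(self, args):
            win32serviceutil.ServiceFramework.__init__(self, args)
            self.stop_event = win32event.CreateEvent(None, 0, 0, None)
            self.proc = None

        def SvcDoRun(self):
            # Exactly one worker process, unlike the Task Scheduler approach.
            self.proc = subprocess.Popen(
                ["celery", "-A", "proj", "worker", "--loglevel=INFO"])
            win32event.WaitForSingleObject(self.stop_event,
                                           win32event.INFINITE)

        def SvcStop(self):
            self.ReportServiceStatus(win32service.SERVICE_STOP_PENDING)
            if self.proc:
                self.proc.terminate()
            win32event.SetEvent(self.stop_event)

    if __name__ == "__main__":
        win32serviceutil.HandleCommandLine(CeleryService)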

How to run celery as a daemon in production?

依然范特西╮ submitted on 2019-11-28 17:05:54
I created a celeryd file in /etc/defaults/ from the code here: https://github.com/celery/celery/blob/3.0/extra/generic-init.d/celeryd

Now when I want to run celeryd as a daemon and do this:

    sudo /etc/init.d/celerdy

it says command not found. Where am I going wrong?

Rohan: I am not sure what you are doing here, but these are the steps to run Celery as a daemon. The file you referred to in the link https://github.com/celery/celery/blob/3.0/extra/generic-init.d/celeryd needs to be copied into your /etc/init.d folder with the name celeryd. Then you need to create a configuration file in the…
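Note that the answer puts the script in /etc/init.d (the question used /etc/defaults/), and the command as quoted also spells the script "celerdy" rather than "celeryd", which by itself would produce "command not found". The configuration file the truncated answer starts to describe lives at /etc/default/celeryd; a sketch following the variable names in Celery's daemonizing docs, with illustrative values:

    # /etc/default/celeryd -- values below are illustrative assumptions
    CELERYD_NODES="worker1"
    CELERY_BIN="/usr/local/bin/celery"
    CELERY_APP="proj"
    CELERYD_CHDIR="/opt/myproject"
    CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
    CELERYD_PID_FILE="/var/run/celery/%n.pid"
    CELERYD_USER="celery"
    CELERYD_GROUP="celery"

Then make the init script executable and start it with: sudo /etc/init.d/celeryd start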

Connect from one Docker container to another

左心房为你撑大大i submitted on 2019-11-28 17:05:20
I want to run rabbitmq-server in one Docker container and connect to it from another container using Celery (http://celeryproject.org/).

I have RabbitMQ running using the command below:

    sudo docker run -d -p :5672 markellul/rabbitmq /usr/sbin/rabbitmq-server

and I run Celery via:

    sudo docker run -i -t markellul/celery /bin/bash

When I try the very basic tutorial to validate the connection, http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html, I get a connection refused error:

    consumer: Cannot connect to amqp://guest@127.0.0.1:5672//: …
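Inside a container, 127.0.0.1 refers to the container itself, so the Celery container cannot reach RabbitMQ at that address. A sketch using Docker's legacy --link flag from the same era (container names are illustrative; a user-defined network is the modern equivalent):

    # run RabbitMQ with a name the other container can link against
    sudo docker run -d --name rabbitmq markellul/rabbitmq /usr/sbin/rabbitmq-server

    # link it into the Celery container as host "rabbit"
    sudo docker run -i -t --link rabbitmq:rabbit markellul/celery /bin/bash

    # inside the Celery container, point the broker at the linked host, e.g.
    #   app = Celery('tasks', broker='amqp://guest@rabbit//')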

Detect whether Celery is Available/Running

烈酒焚心 submitted on 2019-11-28 16:49:42
I'm using Celery to manage asynchronous tasks. Occasionally, however, the Celery process goes down, which causes none of the tasks to get executed. I would like to be able to check the status of Celery and make sure everything is working fine, and if I detect any problems, display an error message to the user.

From the Celery worker documentation it looks like I might be able to use ping or inspect for this, but ping feels hacky and it's not clear exactly how inspect is meant to be used (is it when inspect().registered() is empty?).

Any guidance on this would be appreciated. Basically what I'm looking…
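A sketch of the inspect/ping approach the question mentions (the broker URL and app name are assumptions): an empty or None reply within the timeout means no worker answered, i.e. Celery is effectively down.

    from celery import Celery

    app = Celery('proj', broker='redis://127.0.0.1:6379/0')  # illustrative

    def celery_is_up(timeout=1.0):
        replies = app.control.inspect(timeout=timeout).ping()
        # e.g. {'celery@host': {'ok': 'pong'}} when a worker is alive
        return bool(replies)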