celery

delete Task / PeriodicTask in celery

橙三吉。 submitted on 2019-12-22 04:00:39
Question: How can I delete a regular Task or PeriodicTask in Celery?

Answer 1: You revoke the task. See the documentation: Control.revoke(task_id, destination=None, terminate=False, signal='SIGTERM', **kwargs). Tell all (or specific) workers to revoke a task by id. If a task is revoked, the workers will ignore the task and not execute it after all. Parameters: task_id – id of the task to revoke. terminate – also terminate the process currently working on the task (if any). signal – name of the signal to send to…
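As a hedged illustration of that answer, assuming a configured app and a placeholder broker URL, revocation looks roughly like this:

from celery import Celery

# assumed broker URL; replace with your own
app = Celery('proj', broker='amqp://guest:guest@localhost:5672//')

@app.task
def add(x, y):
    return x + y

result = add.delay(2, 2)

# ask all workers to skip the task if it has not started yet
app.control.revoke(result.id)

# additionally kill the worker process if the task is already running
app.control.revoke(result.id, terminate=True, signal='SIGTERM')

Note that revoke only affects a task instance already sent to the broker; for a PeriodicTask the schedule entry itself must also be removed so the task is not submitted again.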

How to route tasks to different queues with Celery and Django

Deadly submitted on 2019-12-21 16:57:53
Question: I am using the following stack: Python 3.6, Celery v4.2.1 (broker: RabbitMQ v3.6.0), Django v2.0.4. According to Celery's documentation, running scheduled tasks on different queues should be as easy as defining the corresponding queues for the tasks in CELERY_ROUTES; nonetheless, all tasks seem to be executed on Celery's default queue. This is the configuration in my_app/settings.py: CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//" CELERY_ROUTES = { 'app1.tasks.*': {'queue': 'queue1'}, …
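A likely cause, offered here as an assumption since the excerpt is cut off: with Celery 4's lowercase setting names and the usual namespace='CELERY' Django setup, the routing setting is task_routes, so the Django settings key must be CELERY_TASK_ROUTES rather than CELERY_ROUTES. A minimal sketch (app and queue names follow the question; app2 is an assumed second app):

# my_app/settings.py
CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//"
CELERY_TASK_ROUTES = {
    'app1.tasks.*': {'queue': 'queue1'},
    'app2.tasks.*': {'queue': 'queue2'},  # assumed second app, for symmetry
}

# my_app/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')
app = Celery('my_app')
# the CELERY_ prefix is stripped, so CELERY_TASK_ROUTES becomes task_routes
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

Each queue then needs a worker listening on it, e.g. celery -A my_app worker -Q queue1; a worker started without -Q only consumes the default queue.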

How to run a function periodically with Flask and Celery?

心不动则不痛 submitted on 2019-12-21 15:43:06
Question: I have a Flask app that roughly looks like this: app = Flask(__name__) @app.route('/', methods=['POST']) def foo(): data = json.loads(request.data) # do some stuff return "OK". In addition, I would like to run a function every ten seconds from that script, and I don't want to use sleep for that. I have the following Celery script as well: from celery import Celery from datetime import timedelta celery = Celery('__name__') CELERYBEAT_SCHEDULE = { 'add-every-30-seconds': { 'task': 'tasks.add', …
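A minimal sketch of the beat-schedule approach, assuming a Redis broker and an illustrative task name; the ten-second interval matches the question:

from celery import Celery

# assumed broker; any supported broker works
celery = Celery('tasks', broker='redis://localhost:6379/0')

celery.conf.beat_schedule = {
    'run-every-ten-seconds': {
        'task': 'tasks.periodic_job',
        'schedule': 10.0,  # seconds; timedelta(seconds=10) also works
    },
}

@celery.task
def periodic_job():
    # the work to repeat every ten seconds goes here
    print('periodic job ran')

The schedule only fires while a beat process runs alongside the worker, e.g. celery -A tasks worker --beat; the Flask process itself does not need to know about the schedule. Note also that Celery('__name__') passes the literal string '__name__' as the app name; Celery(__name__) is probably what was intended.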

Chain a celery task's results into a distributed group

僤鯓⒐⒋嵵緔 submitted on 2019-12-21 12:23:15
Question: Like in this other question, I want to create a Celery group from a list that's returned by a Celery task. The idea is that the first task will return a list, and the second task will explode that list into concurrent tasks for every item in the list. The plan is to use this while downloading content: the first task gets links from a website, and the second task is a chain that downloads the page, processes it, and then uploads it to s3. Finally, once all the subpages are done, the website is…
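One common recipe for this, sketched here with illustrative task names, is a "dmap" task that receives the first task's list and expands it into a group, cloning a callback signature once per item:

from celery import Celery, group, subtask

app = Celery('crawler', broker='redis://localhost:6379/0')  # assumed broker

@app.task
def get_links(url):
    # stand-in for scraping: return the list of sub-page URLs
    return [url + '/a', url + '/b']

@app.task
def process_page(link):
    # stand-in for the download -> process -> upload-to-s3 chain
    return 'done: ' + link

@app.task
def dmap(items, callback):
    # fan the incoming list out into one concurrent task per item
    callback = subtask(callback)
    return group(callback.clone([item]) for item in items)()

# chain the list-producing task into the fan-out task
result = (get_links.s('http://example.com') | dmap.s(process_page.s())).delay()

process_page can itself be replaced by a longer chain, and a chord can be used instead of a plain group if something must run once every subpage has finished.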

Add, modify, remove celery.schedules at run time

让人想犯罪 __ submitted on 2019-12-21 10:53:08
Question: Is there a way to add, modify, and remove celery.schedules at run time? I need something that reads a db table periodically to know the list of schedules. The documentation says one can use djcelery.schedulers.DatabaseScheduler to achieve what I want, but I am not sure how to do it. I read "How to dynamically add / remove periodic tasks to Celery (celerybeat)" and it is still not clear. Thanks for the help.

Answer 1: When you set in your app settings: CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler', the celery beat proces…
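To make that concrete, a hedged sketch using django-celery's models; once beat runs with the DatabaseScheduler, it picks up changes to these rows at run time (the task name below is illustrative):

from djcelery.models import IntervalSchedule, PeriodicTask

# add: run 'app.tasks.sync_prices' every 30 seconds
schedule, _ = IntervalSchedule.objects.get_or_create(every=30, period='seconds')
task = PeriodicTask.objects.create(
    name='sync-prices-every-30s',
    task='app.tasks.sync_prices',
    interval=schedule,
)

# modify: pause the task without deleting it
task.enabled = False
task.save()

# remove it entirely
task.delete()

In Celery 4+ the maintained equivalent is the django-celery-beat package, whose DatabaseScheduler and PeriodicTask/IntervalSchedule models work the same way.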

Can Celery assign a task to a specific worker

陌路散爱 submitted on 2019-12-21 10:50:09
Question: Celery sends tasks to idle workers. I have a task that runs every 5 seconds, and I want this task to be sent only to one specific worker, while other tasks can share the remaining workers. Can Celery do this? I also want to know what the parameter CELERY_TASK_RESULT_EXPIRES is. Does it mean that the task will not be sent to a worker in the queue, or does it stop the task if it runs too long? Answer 1: Sure, you can. The best way to do it is to separate Celery workers using different queues. You just need to…
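A minimal sketch of the dedicated-queue approach the answer starts to describe, with assumed project, task, and queue names:

from celery import Celery

app = Celery('proj', broker='amqp://guest:guest@localhost:5672//')  # assumed broker

# route only the 5-second task to its own queue
app.conf.task_routes = {
    'proj.tasks.heartbeat': {'queue': 'dedicated'},
}

@app.task
def heartbeat():
    return 'tick'

Start one worker pinned to that queue (celery -A proj worker -Q dedicated) and the rest on the default queue (celery -A proj worker -Q celery). As for CELERY_TASK_RESULT_EXPIRES: neither guess is right; it sets how long task results are kept in the result backend before being cleaned up, and affects neither routing nor execution time.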

Get worker ID in Celery

霸气de小男生 submitted on 2019-12-21 09:03:05
Question: I want to use Celery to run jobs on a GPU server with four Tesla cards. I run the Celery worker with a pool of four workers so that each card always runs one job. My problem is how to instruct the workers to each claim one GPU. Currently I rely on the assumption that the worker processes all have contiguous process IDs: device_id = os.getpid() % self.ndevices However, this is not guaranteed to always work, e.g. when worker processes get restarted over time. So ideally, I would like…
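One commonly suggested approach, sketched here as an assumption since the excerpt cuts off before any answer: in a prefork pool, billiard's current_process().index gives each pool child a stable slot number (0 to pool size minus 1) that survives child restarts, and that slot maps cleanly onto a device id:

from billiard import current_process
from celery import Celery

app = Celery('gpu', broker='redis://localhost:6379/0')  # assumed broker

NDEVICES = 4  # four Tesla cards, per the question

@app.task
def run_on_gpu(payload):
    # slot index of this pool child; restarts reuse the same slot
    device_id = current_process().index % NDEVICES
    # select the card, e.g. by setting CUDA_VISIBLE_DEVICES to str(device_id)
    # before the GPU library initializes in this process
    return device_id

Run the worker with a pool size matching the card count, e.g. celery -A gpu worker -c 4, so every slot corresponds to exactly one GPU.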
