celery-task

Celery Task Custom tracking method

混江龙づ霸主 submitted on 2021-02-11 14:39:44
Question: My main problem is that I need to know whether a task is still queued, started, or revoked. I can't do this with Celery and Redis alone, because results are deleted from Redis 24 hours after they are stored. I had some ideas, but I think the most solid one is to keep a tracking database and manually add the critical information I need about the task a user is running. There are methods that can run before a task starts, and I can also work with the database manually when I create a task or …
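
A minimal sketch of that idea, assuming a hypothetical record_state() helper that writes to your own tracking table: Celery's task lifecycle signals let you persist the state yourself, independent of the result backend.

    from celery import Celery
    from celery.signals import task_prerun, task_postrun, task_revoked

    app = Celery('proj', broker='redis://localhost:6379/0')

    def record_state(task_id, state):
        # Hypothetical helper: write (task_id, state) to your own tracking table.
        print(task_id, state)

    @task_prerun.connect
    def mark_started(task_id=None, task=None, **kwargs):
        record_state(task_id, 'STARTED')

    @task_postrun.connect
    def mark_finished(task_id=None, task=None, state=None, **kwargs):
        record_state(task_id, state)

    @task_revoked.connect
    def mark_revoked(request=None, **kwargs):
        record_state(request.id, 'REVOKED')

Rows you write yourself are never expired by the result backend, so the 24-hour Redis TTL stops being a limitation for tracking.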

Django Celery Group tasks executing only the first task

余生颓废 submitted on 2021-01-27 08:53:31
Question: I have Celery with several different tasks and a single queue. The tasks are not all called at once; which tasks are called varies with the user's request. So I wrote code that identifies which tasks to run, creates subtasks with parameters, and builds a list of them; I add this list to a group and call apply_async() on the group to run the tasks. The code for calling the tasks is as follows:

    tasks_list = []
    for provider_name in params['providers']:
        provider = Provider …
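
A minimal sketch of that pattern, assuming a hypothetical fetch_data task: each subtask is built as a signature with .s(), the list is wrapped in a group, and apply_async() is called once on the group.

    from celery import group
    from proj.tasks import fetch_data  # hypothetical task

    def run_providers(params):
        # One signature per requested provider; nothing executes yet.
        tasks_list = [fetch_data.s(name) for name in params['providers']]
        # Scheduling the group submits every signature, not just the first one.
        return group(tasks_list).apply_async()

If only the first task ever runs, it is worth checking that the list really contains signatures (.s() / .si()) rather than the return values of calling the tasks directly.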

How to register Celery task to specific worker?

一笑奈何 submitted on 2021-01-04 07:19:22
Question: I am developing a web application in Python/Django, and I have several tasks running in Celery. I have to run task A one at a time, so I created a worker with --concurrency=1 and routed task A to that worker using the following command: celery -A proj worker -Q A -c 1 -l INFO. Everything is working fine: this worker handles task A, and the other tasks are routed to the default queue. But the above worker returns all tasks when I use the inspect command to get the registered tasks for the worker. That is …
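
A sketch of the routing side of this setup, with a hypothetical task name task_a in proj.tasks: task A is pinned to its own queue, which only the single-concurrency worker consumes.

    # celeryconfig.py (hypothetical task name)
    task_routes = {
        'proj.tasks.task_a': {'queue': 'A'},   # serialized by the "-Q A -c 1" worker
        # all other tasks stay on the default 'celery' queue
    }

Note that celery inspect registered reports the tasks a worker's code has imported, not the queues it consumes from, so the dedicated worker listing every task is expected behaviour rather than a routing problem.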

Prioritizing queues among multiple queues in celery?

别来无恙 submitted on 2020-12-29 10:00:10
Question: We are using Celery for our asynchronous background tasks, and we have 2 queues for tasks of different priorities, with 2 clusters of nodes serving them separately. Things are working as expected. Question: we mostly get low-priority tasks. For better resource utilization, I am wondering whether there is a way to configure the workers listening to the high-priority queue to listen to both queues, but take jobs from the high-priority queue as long as any are there, and fall back to the low-priority …
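
One common compromise, sketched here with assumed queue names high and low, is to let the high-priority workers consume both queues while keeping prefetching small, so they never reserve a long backlog of low-priority jobs:

    # celeryconfig.py (assumed queue names)
    from kombu import Queue

    task_queues = (
        Queue('high'),
        Queue('low'),
    )
    task_default_queue = 'low'
    # A small prefetch keeps a worker from hoarding low-priority jobs
    # while high-priority work is waiting.
    worker_prefetch_multiplier = 1

Those workers are then started with celery -A proj worker -Q high,low. Plain Celery does not guarantee strict "drain the high queue first" ordering on every broker, so if a hard guarantee is required, keeping a small pool dedicated to the high queue remains the safer design.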

Airflow - Python file NOT in the same DAG folder

二次信任 submitted on 2020-05-09 21:05:59
Question: I am trying to use Airflow to execute a simple Python task.

    from __future__ import print_function
    from airflow.operators.python_operator import PythonOperator
    from airflow.models import DAG
    from datetime import datetime, timedelta
    from pprint import pprint

    seven_days_ago = datetime.combine(datetime.today() - timedelta(7), datetime.min.time())
    args = {
        'owner': 'airflow',
        'start_date': seven_days_ago,
    }
    dag = DAG(dag_id='python_test', default_args=args)

    def print_context(ds, **kwargs):
        pprint …
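
The title concerns a Python file that does not live in the DAG folder. One common workaround, sketched here with a hypothetical /path/to/my_modules directory and my_helpers module, is to put that directory on sys.path in the DAG file before importing the callable:

    import sys
    sys.path.insert(0, '/path/to/my_modules')   # hypothetical location of the helper code

    from my_helpers import print_context        # hypothetical module and callable

    from airflow.models import DAG
    from airflow.operators.python_operator import PythonOperator
    from datetime import datetime, timedelta

    dag = DAG(
        dag_id='python_test_external',
        default_args={
            'owner': 'airflow',
            'start_date': datetime.combine(datetime.today() - timedelta(7),
                                           datetime.min.time()),
        },
    )

    run_this = PythonOperator(
        task_id='print_the_context',
        python_callable=print_context,   # imported from outside the DAG folder
        provide_context=True,            # Airflow 1.x flag so **kwargs like ds are passed
        dag=dag,
    )

Alternatives such as packaging the helper code and installing it into the Airflow environment avoid the sys.path edit altogether.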

How to call a Celery shared_task?

|▌冷眼眸甩不掉的悲伤 submitted on 2020-02-08 10:09:31
Question: I'm trying to use stream_framework in my application (NOT Django), but I'm having a problem calling the stream_framework shared tasks. Celery seems to find the tasks:

     -------------- celery@M3800 v3.1.25 (Cipater)
    ---- **** -----
    --- * ***  * -- Linux-4.15.0-34-generic-x86_64-with-Ubuntu-18.04-bionic
    -- * - **** ---
    - ** ---------- [config]
    - ** ---------- .> app:         task:0x7f8d22176dd8
    - ** ---------- .> transport:   redis://localhost:6379/0
    - ** ---------- .> results:     redis://localhost:6379/0
    - * …
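
Outside Django, @shared_task binds to whatever the current Celery app is, so a minimal sketch (the imported task name is an assumption, not taken from the excerpt) is to create the app and make it the default before importing anything that defines shared tasks:

    from celery import Celery

    # Create the app and make it the implicit default, so @shared_task
    # definitions imported afterwards bind to it.
    app = Celery('myapp',
                 broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/0')
    app.set_default()

    # Import modules that declare @shared_task only after the app exists.
    from stream_framework.tasks import fanout_operation  # assumed task name

    if __name__ == '__main__':
        result = fanout_operation.delay()  # arguments omitted; they depend on the task's signature
        print(result.id)

Because @shared_task resolves against the current app at call time, the important part is that your configured app is the current/default one when the task is invoked.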