python-rq

Cancel an already executing task in Python RQ?

Submitted by Deadly on 2021-02-08 12:45:55
Question: I am using http://python-rq.org/ to queue and execute tasks on Heroku worker dynos. These are long-running tasks, and occasionally I need to cancel them mid-execution. How do I do that from Python?

    from redis import Redis
    from rq import Queue
    from my_module import count_words_at_url

    q = Queue(connection=Redis())
    result = q.enqueue(count_words_at_url, 'http://nvie.com')

and later, in a separate process, I want to do:

    from redis import Redis
    from rq import Queue
    from my_module import count_words_at_url
…
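For reference, recent RQ releases can stop a job that is already running. A minimal sketch, assuming RQ >= 1.7 (which introduced send_stop_job_command) and a placeholder job id:

    # A sketch of cancelling a job, assuming RQ >= 1.7; 'my-job-id' is a
    # placeholder for the id returned by q.enqueue(...).
    from redis import Redis
    from rq.command import send_stop_job_command
    from rq.job import Job

    redis = Redis()
    job = Job.fetch('my-job-id', connection=redis)
    if job.is_started:
        # Ask the worker executing the job to kill its work horse.
        send_stop_job_command(redis, job.id)
    else:
        # A job that has not started yet can simply be cancelled.
        job.cancel()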

Redis queue worker crashes in utcparse

Submitted by 杀马特。学长 韩版系。学妹 on 2021-01-29 16:13:11
Question: I'm trying to get a basic rq setup working, following the tutorial at https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xxii-background-jobs. I'm running on Windows 10 WSL1 with Ubuntu 20.04. I installed rq using sudo apt-get install python3-rq, which is at version 1.2.2, and I installed the Python lib using pip3 install rq, which is then at version 1.4.0. My worker code is in app/tasks.py and is:

    import time

    def example():
        print('Starting task')
        for i in range(1, 10):
            print(i)
            # time.sleep(1)
…
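The apt and pip commands above install two different rq versions (1.2.2 and 1.4.0), and a utcparse crash is typically a symptom of exactly that: the enqueueing side and the worker run mismatched releases whose job timestamp format differs. A quick sanity check, as a sketch:

    # A sketch of checking which rq each side actually imports; run it in
    # both the enqueueing process and the worker environment, and make the
    # versions match (e.g. by removing the apt package or pinning with pip).
    import rq
    print(rq.__version__)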

How to create a ``depends_on`` relationship between scheduled and queued jobs in python-rq

Submitted by 笑着哭i on 2020-01-24 04:09:21
Question: I have a web service (Python 3.7, Flask 1.0.2) with a workflow consisting of 3 steps:

Step 1: Submit a remote compute job to a commercial queuing system (IBM's LSF).
Step 2: Poll the remote compute job status every 61 seconds (61 seconds because the job status results are cached).
Step 3: Post-process the data if step 2 returns remote compute job status == "DONE".

The remote compute job is of arbitrary length (between seconds and days), and each step is dependent on the completion of the …
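For the purely queued case, RQ's enqueue accepts a depends_on argument that chains jobs in exactly this way. A minimal sketch, where submit_job, poll_status, and post_process are hypothetical stand-ins for the three steps (in practice they would live in an importable module so workers can resolve them):

    # A sketch of chaining the three steps with depends_on; the functions
    # are hypothetical placeholders for the asker's real steps.
    from redis import Redis
    from rq import Queue

    def submit_job():
        return 'lsf-job-id'

    def poll_status():
        return 'DONE'

    def post_process():
        pass

    q = Queue(connection=Redis())
    step1 = q.enqueue(submit_job)
    step2 = q.enqueue(poll_status, depends_on=step1)  # runs only after step1 succeeds
    step3 = q.enqueue(post_process, depends_on=step2)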

Interact with celery ongoing task

Submitted by ⅰ亾dé卋堺 on 2020-01-22 10:24:06
Question: We have a distributed architecture based on RabbitMQ and Celery. We can launch multiple tasks in parallel without any issue, and scalability is good. Now we need to control the tasks remotely: PAUSE, RESUME, CANCEL. The only solution we have found is to make an RPC call from the Celery task to another task, which replies to the command after a DB request. The Celery task and the RPC task are not on the same machine, and only the RPC task has access to the DB. Do you have any advice on how to improve this and easily …
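Celery can revoke (cancel) a running task out of the box with app.control.revoke(task_id, terminate=True), but PAUSE and RESUME have no built-in equivalent. One common pattern is to have the task itself poll a control key in a shared store, which removes the RPC round trip entirely. A sketch, assuming a Redis instance all workers can reach; the broker URL and do_work are placeholders:

    # A sketch of cooperative pause/resume/cancel via a Redis control key;
    # the broker URL and do_work() are placeholders.
    import time
    from celery import Celery
    from redis import Redis

    app = Celery('tasks', broker='amqp://localhost')
    control = Redis()

    def do_work(chunk):
        time.sleep(0.1)  # stand-in for one unit of real work

    @app.task(bind=True)
    def long_job(self):
        key = f'job:{self.request.id}:cmd'
        for chunk in range(100):
            cmd = control.get(key)
            while cmd == b'PAUSE':
                time.sleep(1)          # idle until the key changes
                cmd = control.get(key)
            if cmd == b'CANCEL':
                return 'cancelled'
            do_work(chunk)

Pausing and resuming then amount to writing PAUSE, an empty value, or CANCEL to the job's key from any process that can reach Redis.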

python-rq worker closes automatically

Submitted by 六眼飞鱼酱① on 2020-01-06 20:19:31
Question: I am implementing python-rq to pass domains through a queue and scrape them using Beautiful Soup, so I am running multiple workers to get the job done. I have started 22 workers so far, and all 22 are registered in the rq dashboard. But after some time a worker stops by itself and is no longer displayed in the dashboard, although webmin still shows all workers as running. The crawling speed has also decreased, i.e. the workers are not running. I tried running the workers using supervisor …
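Workers that disappear silently have usually died on an unhandled exception or been killed by the OS (for example, out of memory), so before tuning supervisor it can help to run a single worker in the foreground with debug logging and watch for the traceback. A sketch, assuming the default queue:

    # A sketch of running one worker in the foreground with debug logging,
    # so crashes surface as tracebacks instead of silent exits.
    import logging
    from redis import Redis
    from rq import Queue, Worker

    logging.basicConfig(level=logging.DEBUG)

    redis = Redis()
    worker = Worker([Queue('default', connection=redis)], connection=redis)
    worker.work()  # blocks; the log will show why a worker stops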