python-rq

Passing results to a depending job - python-rq

Submitted by 冷暖自知 on 2020-01-01 04:30:10
Question: How do I pass the result of a job to a job that depends on it? What I currently do is pass the id of the first job to the second:

    first = queue.enqueue(firstJob)
    second = queue.enqueue(secondJob, first.id, depends_on=first)

and inside secondJob fetch the first job to get the result:

    first = queue.fetch_job(previous_job_id)
    print(first.result)

Is this the recommended way? Is there any other pattern I can use to directly pass the first job's result to the second?

Answer 1: You can access info about the…
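For reference, a minimal runnable sketch of the pattern the question describes, assuming a module named tasks.py that the worker can import (firstJob/secondJob are renamed to snake_case here; Job.fetch is RQ's standard way to load a job by id):

    # tasks.py -- hypothetical module; the worker must be able to import it
    from redis import Redis
    from rq import Queue
    from rq.job import Job

    def first_job():
        return 42

    def second_job(previous_job_id):
        # Load the finished upstream job by id and read its result
        first = Job.fetch(previous_job_id, connection=Redis())
        print(first.result)

    q = Queue(connection=Redis())
    first = q.enqueue(first_job)
    # depends_on makes RQ hold second_job until first_job has finished
    second = q.enqueue(second_job, first.id, depends_on=first)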

RQ concurrency with supervisord?

Submitted by 时光毁灭记忆、已成空白 on 2019-12-23 16:19:04
Question: All, I'm attempting to 'force' RQ workers to perform concurrently using supervisord. My supervisord setup seems to work fine, as rq-dashboard is showing 3 workers, 3 PIDs and 3 queues (one for each worker/PID). The supervisord setup is as follows (showing only worker 1; 2 more workers are defined below this one):

    [program:rqworker1]
    command = rqworker 1
    process_name = rqworker1-%(process_num)s
    numprocs = 1
    user = username
    autostart = True
    stdout_logfile=/tmp/rqworker1.log
    stdout…
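For comparison, a sketch of the more common layout, with illustrative names: instead of one [program] block per worker, supervisord's numprocs can spawn several identical workers that all consume the same queue, which is the usual way to get RQ concurrency:

    ; supervisord sketch -- program name, user, and paths are illustrative
    [program:rqworker]
    command = rqworker default
    process_name = %(program_name)s-%(process_num)s
    numprocs = 3
    user = username
    autostart = true
    autorestart = true
    stdout_logfile = /tmp/rqworker-%(process_num)s.log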

How to setup an RQ worker on Heroku with RedisCloud using Flask

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-13 04:55:17
Question: I'm attempting to create a python-rq worker on Heroku for my Flask app. The Heroku documentation provides the following example code for creating a worker: https://devcenter.heroku.com/articles/python-rq#create-a-worker

    import os
    import redis
    from rq import Worker, Queue, Connection

    listen = ['high', 'default', 'low']

    redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        with Connection(conn):
            worker = Worker(map(Queue, listen))
            worker.work()
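A minimal adaptation for RedisCloud, assuming the REDISCLOUD_URL config var that the RedisCloud add-on sets (the Procfile line in the comment is illustrative):

    # worker.py -- started from a Procfile entry such as: worker: python worker.py
    import os
    import redis
    from rq import Worker, Queue, Connection

    listen = ['high', 'default', 'low']

    # RedisCloud exposes its URL as REDISCLOUD_URL rather than REDISTOGO_URL
    redis_url = os.getenv('REDISCLOUD_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        with Connection(conn):
            worker = Worker(list(map(Queue, listen)))
            worker.work()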

Destroying / removing a Queue() in Redis Queue (rq) programmatically

Submitted by 安稳与你 on 2019-12-11 09:59:32
Question: Given:

    from redis import Redis
    from rq import Queue

    yesterday = Queue('yesterday', connection=Redis())
    today = Queue('today', connection=Redis())

I would like to programmatically delete the Queue named 'yesterday'.

Answer 1: Try the following (you can validate all of this with redis-cli):

    yesterday.empty()  # This will wipe out rq:queue:yesterday and all of its contents
    del(yesterday)     # Deletes the variable itself
    r = Redis()
    r.srem('rq:queues', 'rq:queue:yesterday')  # Removes the entry from rq…
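Recent RQ releases also ship a built-in helper for this; a short sketch assuming a version where Queue.delete exists:

    from redis import Redis
    from rq import Queue

    yesterday = Queue('yesterday', connection=Redis())
    # delete_jobs=True removes the queued job data as well as the queue itself
    yesterday.delete(delete_jobs=True)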

The proper way to run django-rq in a Docker microservices setup

Submitted by 徘徊边缘 on 2019-12-11 06:34:23
Question: I have a somewhat bad setup of my Docker containers, I guess, because each time I run a task from Django I see in the container's ps aux output that a new process of python manage.py rqworker mail is created instead of the existing one being used. See the screencast: https://imgur.com/a/HxUjzJ5 The command executed in my docker-compose for the rq worker container looks like this:

    #!/bin/sh -e
    wait-for-it
    for KEY in $(redis-cli -h $REDIS_HOST -n 2 KEYS "rq:worker*"); do
        redis-cli -h …
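For comparison, a minimal docker-compose sketch of the usual pattern: a dedicated, long-lived worker container per queue, started directly rather than via shell cleanup loops (service and image names are illustrative):

    # docker-compose.yml -- illustrative names
    services:
      redis:
        image: redis:6
      web:
        build: .
        command: python manage.py runserver 0.0.0.0:8000
        depends_on: [redis]
      rqworker-mail:
        build: .
        # one long-lived worker process consuming the 'mail' queue
        command: python manage.py rqworker mail
        depends_on: [redis]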

Should two modules use the same redis connection? (I'm working with Flask)

Submitted by 我只是一个虾纸丫 on 2019-12-11 04:06:04
Question: I'm building a Flask app that uses a Redis Queue. The code for the worker is:

    import os
    import redis
    from rq import Worker, Queue, Connection

    listen = ['default']
    redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        with Connection(conn):
            worker = Worker(list(map(Queue, listen)))
            worker.work()

And another module, app.py, contains the code to handle the Flask routes. My question is: should app.py create a new Redis connection, as in

    q = Queue(connection=redis.from_url(redis_url))
    q.enqueue(…
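A sketch of the usual answer, assuming a small Flask app: create one module-level Redis connection and reuse it for every Queue (redis-py connections are pooled and safe to share across a module):

    # app.py -- illustrative Flask module sharing one Redis connection
    import os
    import redis
    from flask import Flask
    from rq import Queue

    redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)  # one shared, pooled connection
    q = Queue(connection=conn)

    app = Flask(__name__)

    @app.route('/enqueue')
    def enqueue():
        # 'tasks.some_task' is a placeholder dotted path the worker can import
        job = q.enqueue('tasks.some_task')
        return job.get_id()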

Get *all* current jobs from python-rq

Submitted by 别来无恙 on 2019-12-10 14:51:19
Question: I'm using python-rq to manage Redis-based jobs and I want to determine which jobs are currently being processed by my workers. python-rq offers a get_current_job function to find 'the current job' for a connection, but: I can't get this to work, and I really want a list of all of the jobs currently being processed by all workers on all the queues for this connection, rather than one job from one queue. Here is my code (which always returns None):

    from rq import Queue, get_current_job…
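A sketch of the pattern that usually answers this: iterate over the registered workers and ask each one for its current job (assumes Worker.all and Worker.get_current_job, both present in recent RQ releases):

    from redis import Redis
    from rq import Worker

    conn = Redis()
    # Each registered worker knows the single job it is currently processing
    for worker in Worker.all(connection=conn):
        job = worker.get_current_job()
        if job is not None:
            print(worker.name, job.id, job.func_name)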

How to clear Django RQ jobs from a queue?

Submitted by 放肆的年华 on 2019-12-07 03:19:40
Question: I feel a bit stupid for asking, but it doesn't appear to be in the documentation for RQ. I have a 'failed' queue with thousands of items in it and I want to clear it using the Django admin interface. The admin interface lists them and allows me to delete and re-queue them individually, but I can't believe that I have to dive into the Django shell to do it in bulk. What have I missed?

Answer 1: The Queue class has an empty() method that can be accessed like:

    import django_rq
    q = django_rq.get_failed_queue()
    q.empty()
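Note that newer RQ releases replaced the failed queue with a FailedJobRegistry; a hedged sketch of the equivalent bulk clear under that API:

    import django_rq
    from rq.registry import FailedJobRegistry

    queue = django_rq.get_queue('default')
    registry = FailedJobRegistry(queue=queue)
    # Drop every failed job from the registry and delete the job data too
    for job_id in registry.get_job_ids():
        registry.remove(job_id, delete_job=True)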

Is there a way to submit functions from __main__ using Python RQ

Submitted by 百般思念 on 2019-12-06 12:49:12
Question: In a similar vein to this question, is there any way to submit a function defined in the same file to python-rq? (@GG_Python asked me to create a new question for this.) Usage example:

    # somemodule.py
    from redis import Redis
    from rq import Queue

    def somefunc():
        do_something()

    q = Queue(connection=Redis(host='redis'))
    q.enqueue(somefunc)

Yes, I know the answer is to define somefunc in someothermodule.py and then, in the above snippet, from someothermodule import somefunc, but I really don't want to. Maybe I'm being too much of a stickler for form, but somefunc really belongs in the same…
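One hedged workaround sketch: workers resolve jobs by importable dotted path, so a function defined in a script can still be enqueued by that path, provided the worker can import the same file as a module (here, somemodule.py is assumed to be on the worker's PYTHONPATH):

    # somemodule.py -- runnable as a script *and* importable by the worker
    from redis import Redis
    from rq import Queue

    def somefunc():
        print('doing something')

    if __name__ == '__main__':
        q = Queue(connection=Redis(host='redis'))
        # Enqueue by dotted path so the worker imports 'somemodule', not '__main__'
        q.enqueue('somemodule.somefunc')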
