Flask: passing around background worker job (rq, redis)


I've not used rq before, but I see that a job has an id (job.get_id()) as well as a .key property (the underlying Redis key). It is probably easiest to store the id in your session. Then you can use the Job class's fetch() class method, which loads the job from Redis (calling .refresh() for you) and returns it. Reading .result at that point gives you the job's return value, or None if the worker hasn't finished it yet.

Maybe like this (untested):

from rq.job import Job

@app.route('/make/')
def make():
    job = q.enqueue(do_something, 'argument')
    # store the job id (a string) in the session so it can be fetched later
    session['job'] = job.get_id()
    return 'Done'

@app.route('/get/')
def get():
    try:
        # Job.fetch is a class method: it loads the job from Redis by id
        job = Job.fetch(session['job'], connection=q.connection)
        # job.result is None until the worker has finished the job
        out = str(job.result)
    except Exception:
        out = 'No result yet'
    return out
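
For completeness, a minimal sketch of the setup these snippets assume; the names redis_conn and do_something, the secret_key value, and the local Redis instance are my assumptions, not part of the original question:

from flask import Flask, session
from redis import Redis
from rq import Queue

app = Flask(__name__)
app.secret_key = 'change-me'      # required for Flask sessions

redis_conn = Redis()              # assumes Redis running on localhost:6379
q = Queue(connection=redis_conn)

def do_something(argument):
    # placeholder task; this runs in the rq worker process, not in Flask
    return 'processed ' + argument

The task itself is executed by a separate worker process (started with the rq worker command in the same environment), so do_something must be importable by the worker under the same module path.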

The problem is with serializing the arguments: you are actually trying to serialize the function object itself, which is impossible with pickle.

Try

@app.route('/make/')
def make():
    job = q.enqueue(func=do_something, args=('argument',))
    # store only the job id; a Job object cannot be stored in the Flask session
    session['job'] = job.get_id()
    return 'Done'
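
If you want to distinguish "still running" from "finished" rather than relying on result being None, rq also exposes job.get_status() and job.is_finished. A small sketch of such a polling route (untested; the /status/ route name is my own, and q.connection reuses the queue's Redis connection):

@app.route('/status/')
def status():
    job = Job.fetch(session['job'], connection=q.connection)
    if job.is_finished:
        # the worker has completed the job; result holds the return value
        return str(job.result)
    # get_status() reports e.g. 'queued', 'started' or 'failed'
    return 'Job is ' + job.get_status()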