Multithreading for Python Django


Question


Some functions should run asynchronously on the web server. Sending emails or data post-processing are typical use cases.

What is the best (or most Pythonic) way to write a decorator that runs a function asynchronously?

My setup is a common one: Python, Django, Gunicorn or Waitress, and standard Linux on AWS EC2.

For example, here's a start:

from threading import Thread

def postpone(function):
    def decorator(*args, **kwargs):
        t = Thread(target=function, args=args, kwargs=kwargs)
        t.daemon = True
        t.start()
    return decorator

Desired usage:

@postpone
def foo():
    pass #do stuff

Answer 1:


I've continued using this implementation at scale and in production with no issues.

Decorator definition:

from threading import Thread

def start_new_thread(function):
    def decorator(*args, **kwargs):
        t = Thread(target=function, args=args, kwargs=kwargs)
        t.daemon = True
        t.start()
    return decorator

Example usage:

@start_new_thread
def foo():
    pass  # do stuff

Over time, the stack has been updated and migrated without issue.

Originally Python 2.4.7, Django 1.4, Gunicorn 0.17.2, now Python 3.6, Django 2.1, Waitress 1.1.

If you are using database transactions, Django will create a new connection for the thread, and it needs to be closed manually:

from django.db import connection

@start_new_thread
def foo():
    # do stuff
    connection.close()
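If you would rather not remember the connection.close() call in every task, a minimal sketch (not part of the original answer) is to fold the cleanup into the decorator itself:

from threading import Thread

from django.db import connection


def start_new_thread(function):
    def decorator(*args, **kwargs):
        def wrapped():
            try:
                function(*args, **kwargs)
            finally:
                # Close the per-thread database connection Django opened,
                # even if the wrapped function raises.
                connection.close()

        t = Thread(target=wrapped)
        t.daemon = True
        t.start()
    return decorator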



Answer 2:


Celery is an asynchronous task queue/job queue. It is well documented and perfect for what you need. I suggest you start here.
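For reference, a minimal sketch of what a Celery task could look like; the module name tasks.py and the send_email_task name are illustrative, not from the answer:

# in tasks.py (illustrative example, assuming Celery is configured for the project)
from celery import shared_task
from django.core.mail import send_mail


@shared_task
def send_email_task(subject, body, recipients):
    # Runs in a Celery worker process, outside the request/response cycle.
    send_mail(subject, body, "noreply@example.com", recipients)


# in a view: enqueue the task and return immediately
# send_email_task.delay("Hi", "Welcome!", ["user@example.com"])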




Answer 3:


The most common way to do asynchronous processing in Django is to use Celery and django-celery.




Answer 4:


tomcounsell's approach works well when there are not too many incoming jobs. If many long-running jobs are launched in a short period of time, each spawning its own thread, the main process will suffer. In that case, you can use a thread pool driven by a coroutine:

# in my_utils.py

from concurrent.futures import ThreadPoolExecutor

MAX_THREADS = 10


def run_thread_pool():
    """
    Note that this is not a normal function, but a coroutine.
    All jobs are enqueued first before executed and there can be
    no more than 10 threads that run at any time point.
    """
    with ThreadPoolExecutor(max_workers=MAX_THREADS) as executor:
        while True:
            func, args, kwargs = yield
            executor.submit(func, *args, **kwargs)


pool_wrapper = run_thread_pool()

# Advance the coroutine to the first yield (priming)
next(pool_wrapper)

# in another module, e.g. views.py
from my_utils import pool_wrapper

def job(*args, **kwargs):
    pass  # do something

def handle(request):
    # make args and kwargs
    pool_wrapper.send((job, args, kwargs))
    # return a response
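Since the question asked for a decorator, the same pool can be wrapped in one. This is a sketch building on the pool_wrapper above; the run_in_pool name is illustrative:

from my_utils import pool_wrapper

def run_in_pool(function):
    def decorator(*args, **kwargs):
        # Hand the call off to the shared thread pool instead of
        # spawning a new thread per invocation.
        pool_wrapper.send((function, args, kwargs))
    return decorator

@run_in_pool
def job(*args, **kwargs):
    pass  # do something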


Source: https://stackoverflow.com/questions/18420699/multithreading-for-python-django
