Multithreading for Python Django

情深已故 2020-12-12 14:53

Some functions should run asynchronously on the web server. Sending emails or data post-processing are typical use cases.

What is the best (or most pythonic) way to write such functions?

4 Answers
  • 2020-12-12 15:37

    I've continued using this implementation at scale and in production with no issues.

    Decorator definition:

    from threading import Thread

    def start_new_thread(function):
        def decorator(*args, **kwargs):
            t = Thread(target=function, args=args, kwargs=kwargs)
            t.daemon = True
            t.start()
        return decorator
    

    Example usage:

    @start_new_thread
    def foo():
        ...  # do stuff
    

    Over time, the stack has updated and transitioned without fail.

    Originally Python 2.4.7, Django 1.4, Gunicorn 0.17.2, now Python 3.6, Django 2.1, Waitress 1.1.
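    As a self-contained sanity check (outside Django; the `Event` and the toy `slow_job` exist only so the demo can observe the background thread finishing), the decorator behaves like this:

    ```python
    from threading import Thread, Event

    def start_new_thread(function):
        def decorator(*args, **kwargs):
            t = Thread(target=function, args=args, kwargs=kwargs)
            t.daemon = True
            t.start()
        return decorator

    done = Event()
    results = []

    @start_new_thread
    def slow_job(x):
        # stand-in for email sending / data post-processing
        results.append(x * 2)
        done.set()

    slow_job(21)          # returns immediately; the work runs on a daemon thread
    done.wait(timeout=5)  # demo-only: block so we can inspect the result
    ```

    Note that `slow_job(21)` returns `None` immediately; the decorated function's return value is lost, which is fine for fire-and-forget tasks like email.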

    If the task touches the database, Django will create a new connection for the thread, and it needs to be closed manually:

    from django.db import connection
    
    @start_new_thread
    def foo():
        # do stuff
        connection.close()
    
  • 2020-12-12 15:45

    tomcounsell's approach works well when there are not too many incoming jobs. If many long-running jobs are submitted in a short period of time, spawning a thread per job will hurt the main process. In that case, you can use a thread pool fed by a coroutine:

    # in my_utils.py
    
    from concurrent.futures import ThreadPoolExecutor
    
    MAX_THREADS = 10
    
    
    def run_thread_pool():
        """
        Note that this is not a normal function, but a coroutine.
        All jobs are enqueued first before executed and there can be
        no more than 10 threads that run at any time point.
        """
        with ThreadPoolExecutor(max_workers=MAX_THREADS) as executor:
            while True:
                func, args, kwargs = yield
                executor.submit(func, *args, **kwargs)
    
    
    pool_wrapper = run_thread_pool()
    
    # Advance the coroutine to the first yield (priming)
    next(pool_wrapper)
    
    # in another module, e.g. views.py
    from my_utils import pool_wrapper
    
    def job(*args, **kwargs):
        ...  # do something

    def handle(request):
        # build args and kwargs, then hand the job to the pool
        pool_wrapper.send((job, args, kwargs))
        # return a response
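
    A self-contained sketch of the same pattern (the names mirror `my_utils.py`; the `Event` and toy `job` are added only so the demo can verify a task actually ran on a pool thread):

    ```python
    from concurrent.futures import ThreadPoolExecutor
    from threading import Event

    MAX_THREADS = 10

    def run_thread_pool():
        # coroutine: receives (func, args, kwargs) tuples and submits them
        with ThreadPoolExecutor(max_workers=MAX_THREADS) as executor:
            while True:
                func, args, kwargs = yield
                executor.submit(func, *args, **kwargs)

    pool_wrapper = run_thread_pool()
    next(pool_wrapper)  # prime the coroutine to its first yield

    done = Event()
    out = []

    def job(x, scale=1):
        out.append(x * scale)
        done.set()

    pool_wrapper.send((job, (7,), {"scale": 6}))
    done.wait(timeout=5)  # demo-only: wait for the pool thread to finish
    ```

    The `send()` call returns as soon as the job is enqueued; the executor caps concurrency at `MAX_THREADS`, so a burst of requests queues up instead of spawning unbounded threads.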
    
  • 2020-12-12 15:49

    Celery is an asynchronous task queue/job queue. It's well documented and a good fit for what you need; its documentation is a good place to start.

  • 2020-12-12 15:57

    The most common way to do asynchronous processing in Django is to use Celery and django-celery.
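
    For reference, a minimal Celery task might look like the sketch below. The module, app, and broker names are placeholders (it assumes Celery is installed and a broker such as Redis is running); this is an illustration, not a full setup:

    ```python
    # tasks.py (hypothetical module)
    from celery import Celery

    # broker URL is a placeholder; point it at your actual broker
    app = Celery("myproject", broker="redis://localhost:6379/0")

    @app.task
    def send_welcome_email(user_id):
        # the slow work goes here; it runs in a Celery worker,
        # not in the web process handling the request
        print(f"emailing user {user_id}")

    # in a view you would enqueue it with:
    #   send_welcome_email.delay(some_user_id)
    ```

    Unlike in-process threads, the task survives web-server restarts and can be retried, at the cost of running a broker and worker processes.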
