Celery Worker Database Connection Pooling

萌比男神i · asked 2020-12-07 19:17 · open · 6 answers · 1015 views

I am using Celery standalone (not within Django). I am planning to have one worker task type running on multiple physical machines. The task does the following

    …
6 Answers
  •  一向 (OP)
     2020-12-07 19:54

    Contributing my findings back, based on implementing this and monitoring it.

    Feedback is welcome.

    Reference on using pooling: http://www.prschmid.com/2013/04/using-sqlalchemy-with-celery-tasks.html

    Each worker process (prefork mode, concurrency set with -c k) establishes its own new connection to the DB, with no pooling or reuse across processes. If you do enable pooling, the pool exists only at the level of each worker process, so a pool size greater than 1 is not useful there; reusing a single connection is still worthwhile, though, since it saves repeatedly opening and closing connections (see the sketch below).
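
    A minimal sketch of such a per-process pool, assuming SQLAlchemy and a MySQL backend; the connection URL and pool settings below are illustrative, not from the original question:

        # Per-process engine: in prefork mode every worker process builds its
        # own copy of this pool, so a pool larger than 1 mostly sits idle.
        from sqlalchemy import create_engine

        engine = create_engine(
            "mysql+pymysql://user:password@db-host/mydb",  # hypothetical URL
            pool_size=1,         # one reusable connection per worker process
            max_overflow=0,      # never open extra connections beyond the pool
            pool_recycle=3600,   # refresh before MySQL's wait_timeout closes it
            pool_pre_ping=True,  # detect stale connections before using them
        )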

    With one connection per worker process, a single DB connection is established per worker process (prefork mode, celery -A app worker -c k) during the initialization phase. This avoids opening and closing a connection for every task; a sketch of wiring this up follows.
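
    A sketch of establishing that one connection per process, assuming the celery.signals.worker_process_init signal (fired in each prefork child when it starts); the broker URL, engine settings, and task are illustrative:

        from celery import Celery
        from celery.signals import worker_process_init
        from sqlalchemy import create_engine, text
        from sqlalchemy.orm import sessionmaker

        app = Celery("app", broker="redis://localhost:6379/0")  # hypothetical broker

        engine = None
        Session = None

        @worker_process_init.connect
        def init_db(**kwargs):
            # Runs once in every forked worker process, so each process opens
            # its own connection instead of sharing the parent's.
            global engine, Session
            engine = create_engine(
                "mysql+pymysql://user:password@db-host/mydb",
                pool_size=1, max_overflow=0,
            )
            Session = sessionmaker(bind=engine)

        @app.task
        def do_work(record_id):
            # Reuses the per-process connection instead of opening and
            # closing one on every task invocation.
            session = Session()
            try:
                return session.execute(text("SELECT :rid"), {"rid": record_id}).scalar()
            finally:
                session.close()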

    With the eventlet pool (celery -A app worker -P eventlet), no matter how many worker green threads there are, each one establishes only one connection to the DB, without pooling or reuse. So for eventlet, all worker green threads in one Celery process (celery -A app worker ...) hold one DB connection at any given moment; see the sketch below.
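
    A sketch of the eventlet case, assuming the worker is started with celery -A app worker -P eventlet -c 100: all green threads live in one OS process and share whatever engine that process holds. The pool size is an illustrative cap, and pool.status() is just one way to check how many connections the process actually keeps open:

        from sqlalchemy import create_engine

        # Single process-level engine shared by every green thread in this
        # eventlet worker process.
        shared_engine = create_engine(
            "mysql+pymysql://user:password@db-host/mydb",  # hypothetical URL
            pool_size=10,    # cap on connections shared by all green threads
            max_overflow=0,
        )

        def report_pool():
            # QueuePool.status() reports pooled vs. checked-out connections,
            # handy for confirming how many DB connections this process holds.
            print(shared_engine.pool.status())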

    According to the Celery docs:

    but you need to ensure your tasks do not perform blocking calls, as this will halt all other operations in the worker until the blocking call returns.

    This is probably because the MySQL DB connection makes blocking calls; a sketch of a driver choice that works around this under eventlet follows.
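
    One hedged workaround, not from the original answer: under eventlet, pick a pure-Python driver such as PyMySQL, whose socket I/O the monkey patching can make cooperative, instead of a C driver such as mysqlclient, whose blocking calls stall every green thread in the worker:

        # C driver: the query I/O happens in C and is not monkey-patched,
        # so a slow query blocks the whole eventlet worker.
        BLOCKING_URL = "mysql+mysqldb://user:password@db-host/mydb"

        # Pure-Python driver: eventlet can patch its sockets, so a waiting
        # query yields to other green threads instead of blocking them.
        COOPERATIVE_URL = "mysql+pymysql://user:password@db-host/mydb"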
