Celery Beat: Limit to single task instance at a time

Backend · Open · 5 answers · 1801 views
不思量自难忘° asked on 2020-12-31 17:30

I have celery beat and celery (four workers) to do some processing steps in bulk. One of those tasks is roughly along the lines of, "for each X that hasn't had a Y created

5 answers
  •  一个人的身影
    2020-12-31 18:01

    I took a crack at writing a decorator to use Postgres advisory locking similar to what erydo alluded to in his comment.

    It's not very pretty, but seems to work correctly. This is with SQLAlchemy 0.9.7 under Python 2.7.

    from functools import wraps
    from sqlalchemy import select, func
    
    from my_db_module import Session # SQLAlchemy ORM scoped_session
    
    def pg_locked(key):
        def decorator(f):
            @wraps(f)
            def wrapped(*args, **kw):
                session = Session()
                acquired = False  # so the finally block is safe if execute() raises
                try:
                    acquired, = session.execute(select([func.pg_try_advisory_lock(key)])).fetchone()
                    if acquired:
                        return f(*args, **kw)
                finally:
                    if acquired:
                        session.execute(select([func.pg_advisory_unlock(key)]))
            return wrapped
        return decorator
    
    @app.task
    @pg_locked(0xdeadbeef)
    def singleton_task():
        # only 1x this task can run at a time
        pass
    

    (Would welcome any comments on ways to improve this!)
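    The skip-if-locked pattern above can be sketched without a database, using a process-local non-blocking lock. This is an illustration only: a `threading.Lock` stands in for `pg_try_advisory_lock`, so unlike the Postgres version it does not coordinate across separate worker processes, and the `single_instance` name is hypothetical.

    ```python
    import threading
    from functools import wraps

    def single_instance(lock=None):
        """Run the wrapped function only if no other call holds the lock.

        Overlapping calls return None instead of running. A process-local
        threading.Lock stands in for pg_try_advisory_lock; it does NOT
        coordinate across processes the way the Postgres advisory lock does.
        """
        lock = lock or threading.Lock()

        def decorator(f):
            @wraps(f)
            def wrapped(*args, **kw):
                if not lock.acquire(blocking=False):  # like pg_try_advisory_lock
                    return None                       # another instance is running
                try:
                    return f(*args, **kw)
                finally:
                    lock.release()                    # like pg_advisory_unlock
            return wrapped
        return decorator
    ```

    The key design point carries over from the Postgres version: acquisition must be non-blocking, so an overlapping beat invocation returns immediately instead of queuing up behind the running one.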
