Notify celery task of worker shutdown

天命终不由人 2021-01-05 00:28

I am using celery 2.4.1 with Python 2.6, the RabbitMQ backend, and Django. I would like my task to be able to clean up properly if the worker shuts down. As far as I am aware…

1 answer
  •  长发绾君心
    2021-01-05 00:42

    worker_shutdown is only sent by the MainProcess, not the child pool workers. All worker_* signals except worker_process_init refer to the MainProcess.

    "However, the shutdown hook never gets called. Ctrl-C'ing the worker doesn't kill the task and I have to manually kill it from the shell."

    The worker never terminates a task under normal (warm) shutdown. Even if a task takes days to complete, the worker won't finish shutting down until the task has completed. You can set --soft-time-limit or --time-limit to tell the instance when it's ok to terminate the task.

    So before adding any kind of cleanup step, you first need to make sure the tasks can actually complete, since the cleanup won't be called until that happens.

    To add a cleanup step to the pool worker processes you can use something like:

    from celery import platforms
    from celery.signals import worker_process_init
    
    def cleanup_after_tasks(signum, frame):
        # Reentrant code only here (see http://docs.python.org/library/signal.html)
        pass  # perform your cleanup, e.g. flush buffers, close connections
    
    def install_pool_process_sighandlers(**kwargs):
        platforms.signals["TERM"] = cleanup_after_tasks
        platforms.signals["INT"] = cleanup_after_tasks
    
    worker_process_init.connect(install_pool_process_sighandlers)
    
