Maintaining Logging and/or stdout/stderr in Python Daemon

Backend · open · 3 answers · 2108 views

长情又很酷 · 2020-12-14 19:02

Every recipe that I've found for creating a daemon process in Python involves forking twice (for Unix) and then closing all open file descriptors. (See http://www.jejik.com

3 Answers
  • 2020-12-14 19:09

    We just had a similar issue, and due to some things beyond my control, the daemon setup was separate from the code that created the logger. However, a logger has .handlers and .parent attributes that make it possible with something like:

            self.files_preserve = self.getLogFileHandles(self.data.logger)

        def getLogFileHandles(self, logger):
            """Get a list of file descriptor numbers from a logger
            to be handed to DaemonContext.files_preserve.
            """
            handles = []
            for handler in logger.handlers:
                handles.append(handler.stream.fileno())
            if logger.parent:
                handles += self.getLogFileHandles(logger.parent)
            return handles
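    A standalone sketch of the same idea, outside a class. Note the hedges: only stream-based handlers expose a .stream with a file descriptor, and the log path and logger names here are purely illustrative:

    ```python
    import logging
    import os
    import tempfile

    def get_log_file_handles(logger):
        """Recursively collect file descriptors from a logger's handlers,
        walking up the hierarchy via .parent, so the result can be passed
        to DaemonContext.files_preserve."""
        handles = []
        for handler in logger.handlers:
            # Only stream-based handlers (StreamHandler, FileHandler) have .stream.
            stream = getattr(handler, "stream", None)
            if stream is not None and hasattr(stream, "fileno"):
                handles.append(stream.fileno())
        if logger.parent:
            handles += get_log_file_handles(logger.parent)
        return handles

    # Illustrative setup: a FileHandler on the root, queried via a child logger.
    log_path = os.path.join(tempfile.gettempdir(), "example_daemon.log")
    fh = logging.FileHandler(log_path)
    logging.getLogger().addHandler(fh)
    fds = get_log_file_handles(logging.getLogger("myapp"))
    ```

    The recursion matters because handlers are usually attached to the root logger, not to the child logger your daemon code actually holds.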
    
  • 2020-12-14 19:15

    You can simplify the code for this if you set up your logging handler objects separately from your root logger object, and then add the handler objects as an independent step rather than doing it all at once. The following should work for you.

    import daemon
    import logging
    
    logger = logging.getLogger()
    logger.setLevel(logging.DEBUG)
    fh = logging.FileHandler("./foo.log")
    logger.addHandler(fh)
    
    context = daemon.DaemonContext(
        files_preserve=[
            fh.stream,
        ],
    )

    logger.debug("Before daemonizing.")
    context.open()
    logger.debug("After daemonizing.")
    
  • 2020-12-14 19:23

    I use the python-daemon library for my daemonization behavior.

    Interface described here:

    • http://www.python.org/dev/peps/pep-3143/

    Implementation here:

    • http://pypi.python.org/pypi/python-daemon/

    It allows specifying a files_preserve argument, to indicate any file descriptors that should not be closed when daemonizing.

    If you need logging via the same Handler instances before and after daemonizing, you can:

    1. First set up your logging Handlers using basicConfig or dictConfig or whatever.
    2. Log stuff
    3. Determine what file descriptors your Handlers depend on. Unfortunately this is dependent on the Handler subclass. If your first-installed Handler is a StreamHandler, it's the value of logging.root.handlers[0].stream.fileno(); if your second-installed Handler is a SyslogHandler, you want the value of logging.root.handlers[1].socket.fileno(); etc. This is messy :-(
    4. Daemonize your process by creating a DaemonContext with files_preserve equal to a list of the file descriptors you determined in step 3.
    5. Continue logging; your log files should not have been closed during the double-fork.
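    The steps above can be sketched as follows. The handler_fds helper is my own illustrative name, not part of python-daemon, and the log path is arbitrary; the actual daemonizing calls are shown only as comments:

    ```python
    import logging
    import logging.handlers
    import os
    import tempfile

    def handler_fds(handlers):
        """Best-effort collection of the file descriptors the installed
        handlers depend on (step 3). Which attribute holds the descriptor
        depends on the Handler subclass, hence the per-type checks."""
        fds = []
        for h in handlers:
            if isinstance(h, logging.handlers.SysLogHandler):
                fds.append(h.socket.fileno())   # syslog talks over a socket
            elif isinstance(h, logging.StreamHandler):
                fds.append(h.stream.fileno())   # covers FileHandler too
        return fds

    log_path = os.path.join(tempfile.gettempdir(), "demo_daemon.log")
    logging.basicConfig(filename=log_path, force=True)  # step 1
    logging.info("before daemonizing")                  # step 2
    fds = handler_fds(logging.root.handlers)            # step 3
    # Steps 4-5 (not run here) would look like:
    # import daemon
    # with daemon.DaemonContext(files_preserve=fds):
    #     logging.info("after daemonizing")
    ```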

    An alternative might be, as @Exelian suggested, to actually use different Handler instances before and after the daemonization. Immediately after daemonizing, destroy the existing handlers (by del-ing them from logging.root.handlers?) and create identical new ones; you can't just re-call basicConfig because of the issue that @dave-mankoff pointed out.
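    A rough sketch of that alternative, with an assumed log path. In a real daemon the old descriptors have already been closed by the double-fork machinery, so the close() call may need a try/except there:

    ```python
    import logging
    import os
    import tempfile

    def reinstall_handlers(log_path):
        """Remove the pre-fork handlers from the root logger and install a
        fresh FileHandler. removeHandler() is safer than del-ing entries
        out of logging.root.handlers directly."""
        root = logging.getLogger()
        for handler in list(root.handlers):
            root.removeHandler(handler)
            handler.close()  # may raise if the fd was already closed; see note above
        new_fh = logging.FileHandler(log_path)
        root.addHandler(new_fh)
        return new_fh

    # Would be called immediately after DaemonContext.open():
    log_path = os.path.join(tempfile.gettempdir(), "mydaemon.log")
    fh = reinstall_handlers(log_path)
    logging.getLogger().warning("logging again after daemonizing")
    ```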
