Python, want logging with log rotation and compression

后端 未结 9 1176
暖寄归人
暖寄归人 2020-11-28 01:43

Can anyone suggest a way in python to do logging with:

  • log rotation every day
  • compression of logs when they\'re rotated
  • optional - delete old
9条回答
  •  一个人的身影
    2020-11-28 02:23

    • log rotation every day: Use a TimedRotatingFileHandler
    • compression of logs: Set the encoding='bz2' parameter. (Note this "trick" will only work for Python2. 'bz2' is no longer considered an encoding in Python3.)
    • optional - delete oldest log file to preserve X MB of free space. You could (indirectly) arrange this using a RotatingFileHandler. By setting the maxBytes parameter, the log file will rollover when it reaches a certain size. By setting the backupCount parameter, you can control how many rollovers are kept. The two parameters together allow you to control the maximum space consumed by the log files. You could probably subclass the TimeRotatingFileHandler to incorporate this behavior into it as well.

    Just for fun, here is how you could subclass TimeRotatingFileHandler. When you run the script below, it will write log files to /tmp/log_rotate*.

    With a small value for time.sleep (such as 0.1), the log files fill up quickly, reach the maxBytes limit, and are then rolled over.

    With a large time.sleep (such as 1.0), the log files fill up slowly, the maxBytes limit is not reached, but they roll over anyway when the timed interval (of 10 seconds) is reached.

    All the code below comes from logging/handlers.py. I simply meshed TimeRotatingFileHandler with RotatingFileHandler in the most straight-forward way possible.

    import time
    import re
    import os
    import stat
    import logging
    import logging.handlers as handlers
    
    
    class SizedTimedRotatingFileHandler(handlers.TimedRotatingFileHandler):
        """
        Handler for logging to a set of files, which switches from one file
        to the next when the current file reaches a certain size, or at certain
        timed intervals
        """
    
        def __init__(self, filename, maxBytes=0, backupCount=0, encoding=None,
                     delay=0, when='h', interval=1, utc=False):
            handlers.TimedRotatingFileHandler.__init__(
                self, filename, when, interval, backupCount, encoding, delay, utc)
            self.maxBytes = maxBytes
    
        def shouldRollover(self, record):
            """
            Determine if rollover should occur.
    
            Basically, see if the supplied record would cause the file to exceed
            the size limit we have.
            """
            if self.stream is None:                 # delay was set...
                self.stream = self._open()
            if self.maxBytes > 0:                   # are we rolling over?
                msg = "%s\n" % self.format(record)
                # due to non-posix-compliant Windows feature
                self.stream.seek(0, 2)
                if self.stream.tell() + len(msg) >= self.maxBytes:
                    return 1
            t = int(time.time())
            if t >= self.rolloverAt:
                return 1
            return 0
    
    
    def demo_SizedTimedRotatingFileHandler():
        log_filename = '/tmp/log_rotate'
        logger = logging.getLogger('MyLogger')
        logger.setLevel(logging.DEBUG)
        handler = SizedTimedRotatingFileHandler(
            log_filename, maxBytes=100, backupCount=5,
            when='s', interval=10,
            # encoding='bz2',  # uncomment for bz2 compression
        )
        logger.addHandler(handler)
        for i in range(10000):
            time.sleep(0.1)
            logger.debug('i=%d' % i)
    
    demo_SizedTimedRotatingFileHandler()
    

提交回复
热议问题