Python in-memory cache with time to live

Backend · Unresolved · 8 answers · 1543 views
后悔当初 2020-12-04 23:24

I have multiple threads running the same process that need to be able to notify each other that something should not be worked on for the next n seconds. It's not the end of the world if they do, however.

8 Answers
  •  时光说笑
    2020-12-05 00:22

    I know this is a little old, but for those who are interested in avoiding third-party dependencies, this is a minor wrapper around the builtin functools.lru_cache (I noticed Javier's similar answer after writing this, but figured I'd post it anyway since this doesn't require Django):

    import functools
    import time
    
    
    def time_cache(max_age, maxsize=128, typed=False):
        """Least-recently-used cache decorator with time-based cache invalidation.
    
        Args:
            max_age: Time to live for cached results (in seconds).
            maxsize: Maximum cache size (see `functools.lru_cache`).
            typed: Cache on distinct input types (see `functools.lru_cache`).
        """
        def _decorator(fn):
            @functools.lru_cache(maxsize=maxsize, typed=typed)
            def _new(*args, __time_salt, **kwargs):
                return fn(*args, **kwargs)
    
            @functools.wraps(fn)
            def _wrapped(*args, **kwargs):
                return _new(*args, **kwargs, __time_salt=int(time.time() / max_age))
    
            return _wrapped
    
        return _decorator
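
    How the invalidation works: every call made within the same `max_age`-second window computes the same `__time_salt` value, so `lru_cache` sees an identical key and returns the cached result; once the window rolls over, the salt changes and the wrapped function runs again. A small sketch of the salt arithmetic (the timestamps here are made up for illustration):

```python
max_age = 10  # seconds per cache window

# int(timestamp / max_age) is constant within a window and increments
# exactly when the window rolls over.
t0 = 1000.0
salts = [int((t0 + dt) / max_age) for dt in (0, 3, 9, 10, 15)]
# the first three calls share a window; the last two fall in the next one
```

    One consequence of this design: expiry is aligned to fixed wall-clock windows, not to when each entry was inserted, so an entry cached one second before a window boundary expires one second later.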
    

    And its usage:

    @time_cache(10)
    def expensive(a: int):
        """An expensive function."""
        time.sleep(1 + a)
    
    
    print("Starting...")
    expensive(1)
    print("Again...")
    expensive(1)
    print("Done")
    

    NB this uses time.time and comes with all its caveats (the wall clock can jump forwards or backwards when the system clock is adjusted, e.g. by NTP or DST changes). You may want to use time.monotonic instead where appropriate.
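
    A variant salted with `time.monotonic` (my own sketch, not part of the original answer) is immune to such wall-clock adjustments:

```python
import functools
import time


def monotonic_time_cache(max_age, maxsize=128, typed=False):
    """Like time_cache above, but keyed on time.monotonic so cached
    entries are unaffected by system clock adjustments."""
    def _decorator(fn):
        @functools.lru_cache(maxsize=maxsize, typed=typed)
        def _new(*args, __time_salt, **kwargs):
            return fn(*args, **kwargs)

        @functools.wraps(fn)
        def _wrapped(*args, **kwargs):
            # monotonic time never goes backwards, so windows only advance
            return _new(*args, **kwargs,
                        __time_salt=int(time.monotonic() / max_age))

        return _wrapped

    return _decorator
```

    Note that monotonic time has an arbitrary reference point, which is fine here since only the window boundaries matter, not absolute timestamps.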
