Why doesn't memory get released to system after large queries (or series of queries) in django?

轻奢々 2020-12-02 19:15

First off, DEBUG = False in settings.py, so no, connections['default'].queries is not growing and growing until it uses up all available memory.
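
For completeness, this is roughly how that can be checked. A minimal sketch, assuming a standard Django setup (Django only records queries on a connection when DEBUG = True, or when force_debug_cursor is set on that connection):

    from django.conf import settings
    from django.db import connections, reset_queries

    assert settings.DEBUG is False
    for alias in connections:  # iterate over the configured database aliases
        print(alias, len(connections[alias].queries))  # stays at 0 with DEBUG = False

    # Even with DEBUG = True, the log can be flushed between batches of work:
    reset_queries()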

2 Answers
  •  情话喂你
    2020-12-02 19:47

    I decided to move my comments into an answer to make things clearer.

    Since Python 2.5, CPython's small-object allocator has tracked its internal memory usage and attempted to return completely free arenas to the underlying OS. This works most of the time, but the fact that objects can't be moved around in memory means that fragmentation can be a serious problem.

    Try the following experiment (I used Python 3.2, but 2.5+ should behave similarly if you use xrange instead of range):

    # Create the big lists in advance to avoid skewing the memory counts
    seq1 = [None] * 10**6 # Big list of references to None
    seq2 = seq1[::10]
    
    # Create and reference a lot of smaller lists
    seq1[:] = [[] for x in range(10**6)] # References all the new lists
    seq2[:] = seq1[::10] # Grab a second reference to 10% of the new lists
    
    # Memory fragmentation in action
    seq1[:] = [None] * 10**6 # 90% of the lists are no longer referenced here
    seq2[:] = seq1[::10] # But memory freed only after last 10% are dropped
    

    Note that even if you drop the references to seq1 and seq2 entirely, the above sequence will likely leave your Python process holding a lot of extra memory.
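
    A quick way to watch this from the outside is to sample the process's resident set size at each step. The sketch below assumes Linux, since it reads VmRSS from /proc/self/status (psutil would be the portable alternative); the exact numbers will vary by platform and Python version:

    import re

    def rss_mib():
        # Linux-only: parse VmRSS (reported in kB) from /proc/self/status
        with open('/proc/self/status') as f:
            kb = int(re.search(r'VmRSS:\s+(\d+)', f.read()).group(1))
        return kb / 1024.0

    print('baseline:            %.1f MiB' % rss_mib())
    seq1 = [None] * 10**6
    seq2 = seq1[::10]
    seq1[:] = [[] for x in range(10**6)]
    seq2[:] = seq1[::10]
    print('1M small lists live: %.1f MiB' % rss_mib())
    seq1[:] = [None] * 10**6   # drop 90% of the lists
    print('90%% dropped:         %.1f MiB' % rss_mib())  # shrinks far less than expected
    seq2[:] = seq1[::10]       # drop the last 10%
    print('100%% dropped:        %.1f MiB' % rss_mib())  # only now can whole arenas be freed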

    When people talk about PyPy using less memory than CPython, this is a major part of what they're talking about. Because PyPy doesn't use direct pointer references under the hood, it is able to use a compacting GC, thus avoiding much of the fragmentation problem and more reliably returning memory to the OS.
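
    On the CPython side, the small-object allocator's arena bookkeeping can be inspected directly. This is a diagnostic sketch only: sys._debugmallocstats() is an undocumented, CPython-specific helper (available since 3.3), so its output format is not guaranteed:

    import gc
    import sys

    gc.collect()             # free whatever is collectable first
    sys._debugmallocstats()  # dumps pymalloc pool/arena statistics to stderr;
                             # comparing the number of arenas currently allocated
                             # with the bytes actually in use gives a rough picture
                             # of how fragmented the heap is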
