Force garbage collection in Python to free memory

Asked by 刺人心 on 2020-12-28 22:40

I have a Python 2.7 app that uses lots of dict objects, mostly containing strings for both keys and values.

Sometimes those dicts and strings are no longer needed, and I would like to force Python to release that memory.

2 Answers
  •  不思量自难忘° · 2020-12-28 23:24

    Fredrik Lundh explains:

    If you create a large object and delete it again, Python has probably released the memory, but the memory allocators involved don’t necessarily return the memory to the operating system, so it may look as if the Python process uses a lot more virtual memory than it actually uses.

    and Alex Martelli writes:

    The only really reliable way to ensure that a large but temporary use of memory DOES return all resources to the system when it's done, is to have that use happen in a subprocess, which does the memory-hungry work then terminates.
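    For completeness: Python does expose gc.collect() for triggering a collection by hand, and it will reclaim objects that plain reference counting cannot, such as reference cycles. But as the quotes above note, the memory freed this way does not necessarily go back to the operating system. A minimal sketch of what gc.collect() actually does (the Node class is just an illustrative stand-in):

    ```python
    import gc

    class Node(object):
        """Trivial object that can participate in a reference cycle."""
        def __init__(self):
            self.ref = None

    # Build a reference cycle; reference counting alone cannot free it.
    a, b = Node(), Node()
    a.ref, b.ref = b, a
    del a, b

    # gc.collect() returns the number of unreachable objects it found;
    # the cycle above (at least the two Node instances) is among them.
    collected = gc.collect()
    print(collected)
    ```

    Even after this, the process's resident size may not shrink, which is exactly why the subprocess approach below is the reliable option.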

    So, you could use multiprocessing to spawn a subprocess, perform the memory-hogging calculation, and then ensure the memory is released when the subprocess terminates:

    import multiprocessing as mp
    import resource
    
    def mem():
        # ru_maxrss is reported in kilobytes on Linux.
        print('Memory usage         : % 2.2f MB' % round(
            resource.getrusage(resource.RUSAGE_SELF).ru_maxrss/1024.0,1)
        )
    
    mem()
    
    def memoryhog():
        print('...creating list of dicts...')
        n = 10**5
        l = []
        for i in xrange(n):
            a = 1000*'a'
            b = 1000*'b'
            l.append({ 'a' : a, 'b' : b })
        mem()
    
    # Do the memory-hungry work in a subprocess; its memory is
    # returned to the OS when the process terminates.
    proc = mp.Process(target=memoryhog)
    proc.start()
    proc.join()
    
    mem()
    

    yields

    Memory usage         :  5.80 MB
    ...creating list of dicts...
    Memory usage         :  234.20 MB
    Memory usage         :  5.90 MB
    
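    If you also need a result back from the memory-hungry work, the same idea can be sketched with a multiprocessing.Pool on Python 3: maxtasksperchild=1 retires each worker after one task, so the worker's memory goes back to the OS on exit while the parent keeps only the (small) return value. The build_table function here is a hypothetical stand-in for your own computation:

    ```python
    import multiprocessing as mp

    def build_table(n):
        # Hypothetical memory-hungry computation: build a large dict,
        # then return only a small summary so the parent retains little.
        table = {i: 'x' * 100 for i in range(n)}
        return len(table)

    if __name__ == '__main__':
        # maxtasksperchild=1 makes the pool replace the worker after
        # each task, releasing that worker's memory to the OS.
        with mp.Pool(processes=1, maxtasksperchild=1) as pool:
            result = pool.apply(build_table, (10**4,))
        print(result)
    ```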
