Profile Memory Allocation in Python (with support for Numpy arrays)

Asked by 自闭症患者 on 2020-12-13 10:03

I have a program that contains a large number of objects, many of them Numpy arrays. My program is swapping miserably, and I'm trying to reduce the memory usage, because it…

5 Answers
  •  [愿得一人]
    2020-12-13 11:04

    Have a look at memory_profiler. It provides line-by-line profiling and IPython integration, which make it very easy to use:

    In [1]: import numpy as np
    
    In [2]: %memit np.zeros(int(1e7))
    maximum of 3: 70.847656 MB per loop
    
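    For line-by-line numbers outside IPython you can use memory_profiler's @profile decorator. The sketch below is a minimal example (the function name allocate and the file name example.py are placeholders, not from the original answer); note that 1e7 float64 values take 10,000,000 × 8 bytes ≈ 76 MiB, which roughly matches the ~70 MB increment reported above.

        # example.py -- run with:  python -m memory_profiler example.py
        import numpy as np
        from memory_profiler import profile

        @profile
        def allocate():
            a = np.zeros(int(1e7))   # ~76 MiB of float64
            b = a.copy()             # another ~76 MiB
            del a                    # freed memory may not be returned to the OS right away
            return b

        if __name__ == "__main__":
            allocate()

    Running it prints per-line memory usage and per-line increments for allocate(), which is usually the quickest way to see which allocation is responsible for the growth.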

    Update

    As mentioned by @WickedGrey, there seems to be a bug (see the GitHub issue tracker) when calling a function more than once, which I can reproduce:

    In [2]: for i in range(10):
       ...:     %memit np.zeros(int(1e7))
       ...:     
    maximum of 1: 70.894531 MB per loop
    maximum of 1: 70.894531 MB per loop
    maximum of 1: 70.894531 MB per loop
    maximum of 1: 70.894531 MB per loop
    maximum of 1: 70.894531 MB per loop
    maximum of 1: 70.894531 MB per loop
    maximum of 1: 70.902344 MB per loop
    maximum of 1: 70.902344 MB per loop
    maximum of 1: 70.902344 MB per loop
    maximum of 1: 70.902344 MB per loop
    

    However, I don't know to what extent the results may be influenced (it doesn't seem to be much in my example, so depending on your use case it may still be useful) or when this issue may be fixed. I have asked about it on GitHub.
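
    Until that is resolved, one way to cross-check the repeated-call numbers (a sketch, not part of the original answer; it assumes memory_profiler's memory_usage() helper and a hypothetical allocate() function) is to sample a fresh baseline before each call and subtract it from the peak observed while the call runs:

        import numpy as np
        from memory_profiler import memory_usage

        def allocate():
            return np.zeros(int(1e7))

        for i in range(10):
            # sample the current process for ~1 s to get a baseline (RSS, in MiB)
            baseline = min(memory_usage(-1, interval=0.1, timeout=1))
            # run allocate() and record the peak memory while it executes
            peak = max(memory_usage((allocate, (), {})))
            print("increment: %.1f MiB" % (peak - baseline))

    Because the baseline is re-sampled on every iteration, a leak across repeated calls would show up as a drifting baseline rather than a changing increment.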
