What's the maximum size of a numpy array?

猫巷女王i 2020-12-06 04:27

I'm trying to create a matrix containing 2,708,000,000 elements. When I try to create a numpy array of this size it gives me a value error. Is there any way I can increase the maximum array size?

4 Answers
  •  太阳男子
    2020-12-06 04:58

    I was able to create an array with 6 billion elements that ate up 45 GB of memory. By default, NumPy gives each element 8 bytes (`np.arange` with an integer argument yields int64). By dropping the precision, I was able to save a lot of memory.

    np.arange(6000000000, dtype=np.dtype('f8'))  # float64, ~45 GB
    np.arange(6000000000, dtype=np.dtype('f4'))  # float32, ~23 GB
    # etc...
    

    default (no dtype given) == 8 bytes per element

    • np.float64 -- 45.7GB

    • np.float32 -- 22.9GB

    • np.int8 -- 5.7GB

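    The per-dtype figures above follow directly from element count times each dtype's `itemsize`; a quick sketch that computes them without allocating anything (the measured numbers above come out slightly higher, presumably due to allocator overhead):

    ```python
    import numpy as np

    # Memory needed for a 6-billion-element array at each precision,
    # computed from the dtype's per-element size alone; nothing is allocated.
    n = 6_000_000_000  # the element count used in this answer
    for name in ("float64", "float32", "int8"):
        gib = n * np.dtype(name).itemsize / 1024**3
        print(f"{name}: {gib:.1f} GiB")
    ```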
    Obviously an 8-bit integer can't store a value of 6 billion, but the array can still have 6 billion elements. I'm sure a maximum size exists at some point, but I suspect it's far beyond anything possible in 2016. Interestingly, "Python Blaze" allows you to create numpy arrays on disk. I recall playing with it some time ago and creating an extremely large array that took up 1 TB of disk.
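    Blaze aside, plain NumPy can also back an array with a file on disk via `np.memmap`, so only the pages you actually touch occupy RAM. A minimal sketch, with a made-up file path and a small size for illustration:

    ```python
    import os
    import tempfile

    import numpy as np

    # A disk-backed array: the file holds the data; RAM holds only touched pages.
    path = os.path.join(tempfile.gettempdir(), "big_array.dat")  # hypothetical path
    arr = np.memmap(path, dtype=np.int8, mode="w+", shape=(1_000_000,))

    arr[:10] = np.arange(10, dtype=np.int8)  # write a few elements
    arr.flush()  # push dirty pages back to the file
    ```

    With `dtype=np.int8` the backing file is exactly one byte per element, so very large arrays cost disk space rather than memory.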
