How to estimate how much memory a Pandas' DataFrame will need?

谎友^ 2020-11-30 18:49

I have been wondering... If I am reading, say, a 400MB csv file into a pandas dataframe (using read_csv or read_table), is there any way to guesstimate how much memory this will need?

7 Answers
  •  孤城傲影
    2020-11-30 19:35

    df.memory_usage() will return how many bytes each column occupies:

    >>> df.memory_usage()
    
    Row_ID            20906600
    Household_ID      20906600
    Vehicle           20906600
    Calendar_Year     20906600
    Model_Year        20906600
    ...
    

    To include indexes, pass index=True.

    So to get overall memory consumption:

    >>> df.memory_usage(index=True).sum()
    731731000
    

    Also, passing deep=True will give a more accurate memory usage report that accounts for the full usage of the contained objects.

    This matters because, with deep=False (the default), the reported usage does not include memory consumed by elements that are not components of the underlying array; for object-dtype columns, only the pointers are counted, not the Python objects they reference.
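
    For instance, a minimal sketch of the difference (the column names and values here are invented for illustration, not taken from the question's CSV):

    >>> import pandas as pd
    >>> # Toy frame: "Vehicle" is an object (string) column, so the default
    >>> # shallow report counts only the 8-byte pointers, not the strings.
    >>> df = pd.DataFrame({
    ...     "Row_ID": range(1000),
    ...     "Vehicle": ["sedan", "truck", "coupe", "van"] * 250,
    ... })
    >>> df.memory_usage(index=True).sum()             # shallow total
    >>> df.memory_usage(index=True, deep=True).sum()  # deep total, larger but accurate

    As a shortcut, df.info(memory_usage="deep") prints the deep total at the end of its summary.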
