Python Memory Error encountered when replacing NaN values in large Pandas dataframe
Question

I have a very large pandas DataFrame, `result_full`: ~300,000 columns and ~17,520 rows. I am attempting to replace all of the strings "NaN" with `numpy.nan`:

```python
result_full.replace(["NaN"], np.nan, inplace=True)
```

This is where I get the `MemoryError`. Is there a memory-efficient way to drop these strings from my DataFrame? I tried `result_full.dropna()`, but it didn't work because the values are technically the string "NaN", not actual missing values.

Answer 1:

One of the issues could be that you are using a 32-bit build of Python, which can only address roughly 2 GB of memory regardless of how much RAM the machine has.
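A quick way to check which build you are running is via `sys.maxsize`; this snippet is an illustrative addition, not part of the original answer:

```python
import sys

# sys.maxsize is about 2**31 on a 32-bit build and about 2**63 on a 64-bit build.
print("64-bit" if sys.maxsize > 2**32 else "32-bit")
```

If you are already on 64-bit Python, one common memory-saving pattern is to do the replacement one column at a time, so that only a single column is copied at any moment instead of the whole ~300,000-column frame. The sketch below is an assumption-laden illustration, not the original answer's code; it assumes the "NaN" strings live in object-dtype columns, and `result_full` here is a tiny stand-in for the real DataFrame:

```python
import numpy as np
import pandas as pd

# Small stand-in for the real result_full (hypothetical data).
result_full = pd.DataFrame({"a": ["1.0", "NaN", "2.5"],
                            "b": ["2.0", "3.0", "4.0"]})

# Replace the literal string "NaN" column by column; peak memory is
# one column's copy rather than a copy of the entire frame.
for col in result_full.columns:
    result_full[col] = result_full[col].replace("NaN", np.nan)

# With real NaNs in place, dropna() now works as expected.
result_full = result_full.dropna()
print(result_full)
```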