How do I prevent a memory leak when loading large pickle files in a for loop?
I have 50 pickle files that are 0.5 GB each. Each pickle file consists of a list of custom class objects. I have no trouble loading the files individually using the following function:

```python
def loadPickle(fp):
    with open(fp, 'rb') as fh:
        listOfObj = pickle.load(fh)
    return listOfObj
```

However, when I try to load the files iteratively, memory usage keeps growing until it overflows:

```python
l = ['filepath1', 'filepath2', 'filepath3', 'filepath4']
for fp in l:
    x = loadPickle(fp)
    print('loaded {0}'.format(fp))
```
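One likely cause, assuming the loop above is representative: `x` still references the previous file's entire list while the next file is being unpickled, so peak memory is roughly two files' worth of objects, and cycles among the custom objects may only be reclaimed by the cycle collector. A common mitigation (a sketch, not a guaranteed fix) is to drop the reference explicitly and force a collection before the next load. The `processFiles` helper below is hypothetical, not from the original post:

```python
import gc
import pickle

def loadPickle(fp):
    with open(fp, 'rb') as fh:
        return pickle.load(fh)

def processFiles(filepaths):
    for fp in filepaths:
        x = loadPickle(fp)
        # ... do the per-file work with x here ...
        print('loaded {0}'.format(fp))
        # Drop the only reference before the next iteration so the
        # previous file's objects can be freed while the next file
        # is being unpickled, instead of both being alive at once.
        del x
        # Reclaim any reference cycles among the custom class objects.
        gc.collect()
```

Note that even after objects are freed, CPython may not return all heap memory to the OS; if the process still grows unboundedly, loading each file in a short-lived child process (e.g. via `multiprocessing`) guarantees the memory is released when the child exits.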