Python load 2GB of text file to memory


Question


In Python 2.7, when I load the full contents of a 2.5GB text file into memory for quicker processing, like this:

>>> f = open('dump.xml','r')
>>> dump = f.read()

I got the following error:

Python(62813) malloc: *** mmap(size=140521659486208) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
MemoryError

Why did Python try to allocate 140521659486208 bytes of memory for 2563749237 bytes of data? How do I fix the code so that it loads all the bytes?

I have around 3GB of RAM free. The file is a Wiktionary XML dump.
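(For reference, one common way to sidestep a single giant allocation is to process the file in fixed-size chunks rather than with one read() call. A minimal sketch follows; the 64MB chunk size is an arbitrary choice and process() is a hypothetical placeholder for whatever per-chunk work you need:)

def process(chunk):
    # hypothetical handler -- replace with the real per-chunk work
    pass

with open('dump.xml', 'rb') as f:
    while True:
        chunk = f.read(64 * 1024 * 1024)  # read 64MB at a time (arbitrary size)
        if not chunk:
            break  # empty string means end of file
        process(chunk)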


Answer 1:


If you use mmap, you can map the entire file into your address space at once; the operating system pages the data in on demand, so Python never has to request one huge contiguous allocation.

import mmap

with open('dump.xml', 'rb') as f:
    # Length 0 maps the ENTIRE file. prot=mmap.PROT_READ is Unix-only;
    # on Windows use access=mmap.ACCESS_READ instead.
    m = mmap.mmap(f.fileno(), 0, prot=mmap.PROT_READ)  # read-only mapping

    # Proceed with your code here -- the file is paged in on demand,
    # so "readline" here will be as fast as could be
    data = m.readline()
    while data:
        # Do stuff
        data = m.readline()
    m.close()
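The mapped object also supports string-style operations such as find() and slicing, so you can scan the dump for markers without copying the whole 2.5GB buffer. A small sketch, using '<page>' purely as an illustrative element name from the dump:

import mmap

with open('dump.xml', 'rb') as f:
    m = mmap.mmap(f.fileno(), 0, prot=mmap.PROT_READ)
    start = m.find(b'<page>')  # scan in place, no copy of the buffer
    if start != -1:
        end = m.find(b'</page>', start)
        page = m[start:end + len('</page>')]  # slicing copies only this span
    m.close()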



Answer 2:


Based on some quick googling, I came across this forum post, which seems to address the issue you are having. Assuming you are running Mac or Linux (judging by the error output), you may try forcing garbage collection with gc.enable() and gc.collect(), as suggested in the forum post.
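A minimal sketch of that suggestion, assuming the intent is simply to force a collection before attempting the big read:

import gc

gc.enable()   # make sure automatic garbage collection is on
gc.collect()  # free any unreachable objects before the large allocation

with open('dump.xml', 'r') as f:
    dump = f.read()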



Source: https://stackoverflow.com/questions/11159077/python-load-2gb-of-text-file-to-memory
