Dealing with large files in Haskell

Submitted by 五迷三道 on 2019-12-04 19:31:37

Question


I have a large file (4+ GB) of, let's say, 4-byte floats. I would like to treat it as a list, in the sense that I would like to be able to use map, filter, foldl, etc. However, instead of producing a new list as output, I would like to write the output back into the file, and thus only have to load a small portion of the file into memory. You could say I want a type called MutableFileList.

Has anyone run into this situation before? Rather than reinvent the wheel, I was wondering whether there is a hackish way of dealing with this.


Answer 1:


You should not treat the file as a [Double] or [Float] in memory. What you can do instead is use one of the list-like packed array types, such as uvector or vector, in combination with mmapFile or readFile, to pull in chunks of the file at a time and process them. Alternatively, use a lazy packed array type, analogous to lazy ByteStrings.
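As a minimal sketch of the chunk-at-a-time idea (using the bytestring and binary boot libraries rather than uvector/vector, and assuming the file holds little-endian 4-byte floats), a hypothetical `foldFloatsFile` could fold over every float in the file while only ever holding one fixed-size chunk in memory:

```haskell
{-# LANGUAGE BangPatterns #-}
import qualified Data.ByteString as B
import qualified Data.ByteString.Lazy as BL
import Data.Binary.Get (runGet, getFloatle)
import Data.List (foldl')
import System.IO (withBinaryFile, IOMode (ReadMode))

-- Hypothetical helper: fold over the 4-byte little-endian floats in a
-- file, one fixed-size chunk at a time, so memory use stays constant
-- regardless of file size. Trailing bytes that do not fill a whole
-- float are ignored.
foldFloatsFile :: (a -> Float -> a) -> a -> FilePath -> IO a
foldFloatsFile f z path = withBinaryFile path ReadMode (go z)
  where
    chunkBytes = 65536                       -- a multiple of 4
    go !acc h = do
      chunk <- B.hGet h chunkBytes
      if B.null chunk
        then pure acc
        else go (foldl' f acc (floatsOf chunk)) h

    -- Decode a strict chunk into its floats, 4 bytes at a time.
    floatsOf bs
      | B.length bs < 4 = []
      | otherwise =
          runGet getFloatle (BL.fromStrict (B.take 4 bs))
            : floatsOf (B.drop 4 bs)
```

For real workloads you would likely decode each chunk into an unboxed vector instead of a list, but the shape of the loop is the same.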




Answer 2:


Lazy I/O should be quite helpful to you. You can use readFile and writeFile for what you need, and everything is done lazily: data is only kept in memory while it is still being used, so you can read, process, and write the file out without blowing up your machine's memory.
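A sketch of that lazy read-process-write pipeline, again assuming little-endian 4-byte floats and using the binary package for (de)serialization; `mapFloatsFile` is a hypothetical helper, not a library function:

```haskell
import qualified Data.ByteString.Lazy as BL
import Data.Binary.Get (runGet, getFloatle)
import Data.Binary.Put (runPut, putFloatle)

-- Hypothetical helper: rewrite every float in one file into another.
-- Lazy ByteStrings are read and written chunk by chunk, so only a
-- small window of the file should be live at any time. This is a
-- sketch; a production version might prefer an explicit streaming
-- library (conduit, streamly, ...) over lazy I/O.
mapFloatsFile :: (Float -> Float) -> FilePath -> FilePath -> IO ()
mapFloatsFile f inPath outPath = do
  bytes <- BL.readFile inPath
  BL.writeFile outPath
    . runPut
    . mapM_ (putFloatle . f)
    . floats
    $ bytes
  where
    -- Lazily peel off one 4-byte float at a time.
    floats bs = case BL.splitAt 4 bs of
      (h, t) | BL.length h == 4 -> runGet getFloatle h : floats t
             | otherwise        -> []
```

Note that the input and output paths must differ: the input file is still being read lazily while the output is written, so rewriting a file in place this way would not work.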




Answer 3:


You could use mmap to map the file into memory and then process it in place. There is an mmap package on Hackage that supports reading and writing memory-mapped files, and it can even map chunks of a file lazily, though I haven't tried it.

The interface for writing to a mapped file is fairly low-level, so you would have to build your own abstractions or work with Foreign.Ptr and the like.
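To give a flavour of that low-level Foreign.Ptr style, here is a hypothetical `modifyFloatAt` that updates a single float in place. It uses plain seek/read/write through a raw buffer rather than the mmap package, but the peek/poke against a pointer is the same kind of code you would write against a mapped region. It assumes the floats are stored in native (host) byte order:

```haskell
import Foreign.Marshal.Alloc (alloca)
import Foreign.Storable (peek, poke)
import System.IO

-- Hypothetical helper: apply a function to the float at the given
-- index, writing the result back to the same position in the file.
-- Assumes 4-byte floats in native byte order.
modifyFloatAt :: FilePath -> Integer -> (Float -> Float) -> IO ()
modifyFloatAt path ix f = withBinaryFile path ReadWriteMode $ \h ->
  alloca $ \p -> do
    let off = ix * 4
    hSeek h AbsoluteSeek off
    _ <- hGetBuf h p 4          -- read the raw bytes into the buffer
    x <- peek p :: IO Float     -- reinterpret them as a Float
    poke p (f x)                -- overwrite the buffer in place
    hSeek h AbsoluteSeek off
    hPutBuf h p 4               -- write the bytes back
```

With mmap you would get the pointer from the mapped region instead of allocating one, and could drop the explicit seeks and buffer copies entirely.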



Source: https://stackoverflow.com/questions/1925581/dealing-with-large-files-in-haskell
