Ruby - Read file in batches

Posted by 社会主义新天地 on 2019-12-19 17:40:47

Question


I am reading a file that is 10 MB in size and contains some IDs. I read them into a list in Ruby. I am concerned that this might cause memory issues in the future, when the number of IDs in the file grows. Is there an effective way of reading a large file in batches?
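Presumably the current code looks something like the following (a minimal sketch; the file name and the strip call are assumptions), which loads every ID into memory at once:

# Reads the whole file into an array of lines in one go.
# Fine at 10 MB, but memory use grows linearly with the file size.
ids = File.readlines('ids.txt').map(&:strip)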

Thank you


Answer 1:


There's no universal way.

1) You can read the file in chunks:

File.open('filename', 'r') do |f|
  while (chunk = f.read(2048))
    # process the 2048-byte chunk here
  end
end

Disadvantage: you can miss a substring if it straddles a chunk boundary, e.g. you are looking for "SOME_TEXT", but "SOME_" is the last 5 bytes of the first 2048-byte chunk and "TEXT" is the first 4 bytes of the second.
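One way around this, sketched below (not part of the original answer; the file name and chunk size are placeholders), is to carry the trailing partial line of each chunk over into the next one:

File.open('filename', 'r') do |f|
  carry = ''
  while (chunk = f.read(2048))
    buffer = carry + chunk
    pieces = buffer.split("\n", -1)
    carry  = pieces.pop || ''   # incomplete trailing line, carried into the next chunk
    pieces.each do |line|
      # process one complete line here
    end
  end
  # carry now holds the last line if the file does not end with a newline
end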

2) You can read the file line by line:

File.open('filename', 'r') do |f|
  while (line = f.gets)
    # process one line here
  end
end

Disadvantage: this way is roughly 2x to 5x slower than the first method.
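If the file holds one ID per line, a hedged sketch of batching with this line-by-line method (the file name and batch size are assumptions) could look like:

File.open('ids.txt', 'r') do |f|
  # each_line returns an enumerator; each_slice pulls 1000 lines at a time
  f.each_line.each_slice(1000) do |batch|
    ids = batch.map(&:strip)
    # process up to 1000 ids here
  end
end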




Answer 2:


With lazy enumerators and each_slice, you can get the best of both worlds. You don't need to worry about lines being cut in the middle, and you can iterate over multiple lines in a batch. batch_size can be chosen freely.

header_lines = 1
batch_size   = 2000

File.open("big_file") do |file|
  file.lazy.drop(header_lines).each_slice(batch_size) do |lines|
    # do something with batch of lines
  end
end

It can be used to import a huge CSV file into a database:

require 'csv'
batch_size   = 2000

File.open("big_data.csv") do |file|
  headers = file.first
  file.lazy.each_slice(batch_size) do |lines|
    csv_rows = CSV.parse(lines.join, write_headers: true, headers: headers)
    # do something with 2000 csv rows, e.g. bulk insert them into a database
  end
end


Source: https://stackoverflow.com/questions/2962134/ruby-read-file-in-batches
