I have been looking for an elegant and efficient way to chunk a string into substrings of a given length in Ruby.
So far, the best I could come up with is this:
Are there some other constraints you have in mind? Otherwise I'd be awfully tempted to do something simple like
(0..10).map { |i|
  str[(i * w), w]
}
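As a slightly fleshed-out sketch of the same idea (names are mine: str is the input, w the chunk width), the range can be derived from the string length instead of hard-coded:

def chunks(str, w)
  # One index per chunk, rounding up so the short tail is included.
  (0...(str.length / w.to_f).ceil).map { |i| str[i * w, w] }
end

chunks("abcdefghij", 4)  # => ["abcd", "efgh", "ij"]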
Just text.scan(/.{1,4}/m) solves the problem.
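To see why the /m flag matters when the input contains newlines (example input is mine):

text = "abcd\nefgh"
text.scan(/.{1,4}/)   # => ["abcd", "efgh"]       (plain . skips the newline)
text.scan(/.{1,4}/m)  # => ["abcd", "\nefg", "h"] (/m lets . match newlines too)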
A better solution that also handles the final part of the string, which can be shorter than the chunk size:
def chunk(in_str, sz)
  return [in_str] if in_str.length < sz
  m = in_str.length % sz  # length of the final, possibly short, chunk
  partial = (in_str.length / sz).times.collect { |i| in_str[i * sz, sz] }
  partial << in_str[-m..-1] if m != 0  # append the short tail, if any
  partial
end
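For example, with some sample inputs:

chunk("abcdefghij", 4)  # => ["abcd", "efgh", "ij"]
chunk("ab", 4)          # => ["ab"]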
Use String#scan:
>> 'abcdefghijklmnopqrstuvwxyz'.scan(/.{4}/)
=> ["abcd", "efgh", "ijkl", "mnop", "qrst", "uvwx"]
>> 'abcdefghijklmnopqrstuvwxyz'.scan(/.{1,4}/)
=> ["abcd", "efgh", "ijkl", "mnop", "qrst", "uvwx", "yz"]
>> 'abcdefghijklmnopqrstuvwxyz'.scan(/.{1,3}/)
=> ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx", "yz"]
Here is another solution for a slightly different case: processing large strings when there is no need to store all the chunks at once. It holds a single chunk at a time and performs much faster than slicing the string:
require 'stringio'

io = StringIO.new(string)
until io.eof?
  chunk = io.read(chunk_size)
  do_something(chunk)
end
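A self-contained run of the same loop, with placeholder input and puts standing in for do_something:

require 'stringio'

io = StringIO.new("abcdefghij")
until io.eof?
  puts io.read(4)  # prints "abcd", "efgh", "ij" on separate lines
end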
I made a little test that chops about 593 MB of data into 18,991 32 KB pieces. Your slice+map version ran for at least 15 minutes at 100% CPU before I pressed Ctrl+C. This version using String#unpack finished in 3.6 seconds:
def chunk(string, size)
  # "aN" extracts N bytes per directive; repeat it once per chunk,
  # rounding up so the short tail comes through unpadded.
  string.unpack("a#{size}" * (string.size / size.to_f).ceil)
end
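For example:

chunk("abcdefghij", 4)  # => ["abcd", "efgh", "ij"]

One caveat worth knowing: unpack slices by bytes, not characters, so multibyte input can be split mid-character.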