I have a list with an unknown number of items, let's say 26:

list = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']

How can I split it into chunks of a given size n?
If you're working with an iterable (like a file) that is potentially too big to fit in RAM, you can use something like this:
from itertools import zip_longest  # izip_longest on Python 2

def chunker(iterable, n, fillvalue=None):
    """Like the grouper recipe from the itertools docs, but the padding
    added to fill the last chunk is dropped.

    >>> list(chunker([1, 2, 3, 4, 5], 3))
    [[1, 2, 3], [4, 5]]
    """
    if n < 1:
        raise ValueError("can't chunk by n=%d" % n)
    # Reuse the same iterator n times so zip_longest pulls n items per chunk.
    args = [iter(iterable)] * n
    return (
        [e for e in t if e is not fillvalue]  # drop padding (assumes fillvalue never occurs in the data)
        for t in zip_longest(*args, fillvalue=fillvalue)
    )
with open('some_file') as f:
    for row in chunker(f, 5):
        print("".join(row))
If RAM is not a concern, the answer by ZdaR is preferable, as it is considerably faster.
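For reference, that faster approach is plain list slicing; a minimal sketch of the idea (my paraphrase, not necessarily ZdaR's exact code, and the helper name chunker_slices is hypothetical):

def chunker_slices(seq, n):
    # Requires the whole sequence in memory, but avoids the per-item
    # overhead of zip_longest and the padding filter.
    return [seq[i:i + n] for i in range(0, len(seq), n)]

print(chunker_slices([1, 2, 3, 4, 5], 3))  # [[1, 2, 3], [4, 5]]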