I have found a solution but it is really slow:
    def chunks(self, data, SIZE=10000):
        # Materializes the full item list, then copies a slice for every chunk
        for i in range(0, len(data), SIZE):
            yield dict(list(data.items())[i:i + SIZE])
Since the dictionary is so big, it would be better to keep everything involved as iterators and generators, like this:
    from itertools import islice

    def chunks(data, SIZE=10000):
        it = iter(data)
        for i in range(0, len(data), SIZE):
            yield {k: data[k] for k in islice(it, SIZE)}
Sample run:
    for item in chunks({i: i for i in range(10)}, 3):
        print(item)
Output:
    {0: 0, 1: 1, 2: 2}
    {3: 3, 4: 4, 5: 5}
    {6: 6, 7: 7, 8: 8}
    {9: 9}
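If you want to avoid the extra lookup `data[k]` for every key, a variant of the same `islice` idea can consume `(key, value)` pairs straight from the items iterator (a sketch; `chunk_dict` is a hypothetical name, not from the original answer):

```python
from itertools import islice

def chunk_dict(data, size=10000):
    # Hypothetical variant: build each chunk directly from the (key, value)
    # pairs of the items iterator, so the dict is never re-indexed per key.
    it = iter(data.items())
    while batch := dict(islice(it, size)):
        yield batch

# Same sample input as above:
for part in chunk_dict({i: i for i in range(10)}, 3):
    print(part)
```

Like the answer's version, this walks the dictionary exactly once; the `while`/walrus loop also frees it from needing `len(data)`, so it works for any iterable of pairs.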