I have found a solution but it is really slow:
def chunks(self, data, SIZE=10000):
    # Materialize the items once; dict views cannot be sliced directly in Python 3
    items = list(data.items())
    for i in range(0, len(data), SIZE):
        yield dict(items[i:i + SIZE])
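For context, a minimal usage sketch (assumptions: the dictionary is named d here, and the generator is called as a plain function, so None stands in for the unused self):

d = {i: str(i) for i in range(25)}      # hypothetical example data
for piece in chunks(None, d, SIZE=10):  # self is never used inside chunks
    print(len(piece))                   # prints 10, 10, 5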
An alternative is to turn the items into a list of pairs and let numpy split it:

import numpy as np

chunk_size = 3
# d is the source dictionary; note that array_split returns chunk_size
# sub-arrays, i.e. chunk_size is the number of chunks, not their length
chunked_data = [[k, v] for k, v in d.items()]
chunked_data = np.array_split(chunked_data, chunk_size)
Afterwards chunked_data is a list of ndarrays of (key, value) pairs, which you can iterate like this:
for chunk in chunked_data:
    for key, value in chunk:
        print(key)
        print(value)
Each chunk can then be converted back into a dictionary with a simple for loop, as sketched below.
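A minimal sketch of that conversion (caveat: numpy coerces keys and values to a common dtype, so the rebuilt dictionaries may hold strings rather than the original types):

list_of_dicts = []
for chunk in chunked_data:
    # each chunk is a 2-column array of (key, value) rows
    list_of_dicts.append({key: value for key, value in chunk})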