I have found a solution but it is really slow:
def chunks(data, SIZE=10000):
    for i in range(0, len(data), SIZE):
        # Rebuilding the full item list on every iteration is what makes this slow.
        yield dict(list(data.items())[i:i + SIZE])
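A faster variant (a sketch, not the original author's code) can walk the dict's key iterator with itertools.islice, so no full item list is ever built:

```python
from itertools import islice

def chunks(data, size=10000):
    """Yield sub-dicts of at most `size` items, in a single pass."""
    it = iter(data)
    for _ in range(0, len(data), size):
        # islice consumes the next `size` keys from the shared iterator.
        yield {k: data[k] for k in islice(it, size)}
```

Because the iterator is shared across yields, each chunk picks up exactly where the previous one stopped.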
This code takes a large dictionary and splits it into a list of small dictionaries. The max_limit parameter sets the maximum number of key-value pairs allowed in each sub-dictionary. Splitting is cheap: it needs only a single pass over the dictionary.
import copy

def split_dict_to_multiple(input_dict, max_limit=200):
    """Splits dict into multiple dicts with given maximum size.
    Returns a list of dictionaries."""
    chunks = []
    curr_dict = {}
    for k, v in input_dict.items():
        if len(curr_dict) < max_limit:
            curr_dict.update({k: v})
        else:
            chunks.append(copy.deepcopy(curr_dict))
            curr_dict = {k: v}
    # append the last, possibly partial, curr_dict
    chunks.append(curr_dict)
    return chunks
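To illustrate how the chunks line up, here is a quick check (repeating the function so the snippet runs standalone):

```python
import copy

def split_dict_to_multiple(input_dict, max_limit=200):
    """Splits dict into multiple dicts with given maximum size."""
    chunks = []
    curr_dict = {}
    for k, v in input_dict.items():
        if len(curr_dict) < max_limit:
            curr_dict.update({k: v})
        else:
            chunks.append(copy.deepcopy(curr_dict))
            curr_dict = {k: v}
    chunks.append(curr_dict)
    return chunks

data = {f"key{i}": i for i in range(5)}
parts = split_dict_to_multiple(data, max_limit=2)
print([len(p) for p in parts])  # → [2, 2, 1]
```

Together the parts cover every key-value pair of the input exactly once.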