I have created a dictionary in Python and dumped it into a pickle file; its size grew to about 300MB. Now, I want to load the same pickle:
import pickle
output = open('myfile.pkl', 'rb')
mydict = pickle.load(output)
If you are storing the dictionary in a single file, it's the load time for that large file that is slowing you down. One of the easiest things you can do is to write the dictionary to a directory on disk, with each dictionary entry being an individual file. Then the files can be pickled and unpickled in multiple threads (or using multiprocessing). For a very large dictionary, this should be much faster than reading from and writing to a single file, regardless of the serializer you choose.

There are packages like klepto and joblib that already do much (if not all) of the above for you. I'd check those packages out. (Note: I am the klepto author. See https://github.com/uqfoundation/klepto).
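For illustration, here is a minimal sketch of that per-entry approach using only the standard library; the names dump_dict/load_dict and the entry_*.pkl file layout are just illustrative choices, not klepto's or joblib's actual API:

import os
import pickle
from concurrent.futures import ThreadPoolExecutor

def dump_entry(dirname, i, key, value):
    # Pickle one (key, value) pair into its own file inside the directory.
    path = os.path.join(dirname, 'entry_%d.pkl' % i)
    with open(path, 'wb') as f:
        pickle.dump((key, value), f, protocol=pickle.HIGHEST_PROTOCOL)

def load_entry(path):
    # Unpickle one (key, value) pair.
    with open(path, 'rb') as f:
        return pickle.load(f)

def dump_dict(d, dirname, workers=8):
    # Write each dictionary entry as a separate file, using a thread pool.
    os.makedirs(dirname, exist_ok=True)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(dump_entry, dirname, i, k, v)
                   for i, (k, v) in enumerate(d.items())]
        for fut in futures:
            fut.result()

def load_dict(dirname, workers=8):
    # Read all entry files back in parallel and rebuild the dictionary.
    paths = [os.path.join(dirname, name) for name in os.listdir(dirname)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(load_entry, paths))

Usage would look like dump_dict(mydict, 'mydict_dir') followed later by mydict = load_dict('mydict_dir'). Threads work here because file I/O releases the GIL; klepto's dir_archive handles this kind of directory-backed storage for you without the hand-rolled bookkeeping.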