Read large file in parallel?

被撕碎了的回忆 2020-12-05 04:50

I have a large file which I need to read in and build a dictionary from. I would like this to be as fast as possible, but my Python code is too slow. Here is a minimal example:
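A sequential sketch of the task described above (the tab-separated key/value format and the filename are assumptions, not the asker's original code):

    def build_dict(path):
        # Sequential baseline: parse one key/value pair per line.
        d = {}
        with open(path) as f:
            for line in f:
                key, value = line.rstrip("\n").split("\t", 1)
                d[key] = value
        return d

    d = build_dict("big_file.txt")  # hypothetical filename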

6 Answers
  •  渐次进展
    2020-12-05 05:32

    If the data in the file does not change often, you can serialize the parsed dictionary once; the Python interpreter will deserialize it much more quickly than re-parsing the text. You can use the cPickle module (on Python 3, the plain pickle module already uses the fast C implementation).
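    A minimal sketch of this approach (filenames and the tab-separated line format are assumptions):

        import pickle

        # One-time cost: parse the text file and cache the resulting
        # dict as a binary pickle file.
        def build_and_cache(txt_path, cache_path):
            d = {}
            with open(txt_path) as f:
                for line in f:
                    key, value = line.rstrip("\n").split("\t", 1)
                    d[key] = value
            with open(cache_path, "wb") as f:
                pickle.dump(d, f, protocol=pickle.HIGHEST_PROTOCOL)
            return d

        # Subsequent runs: loading the pickle skips text parsing entirely.
        def load_cached(cache_path):
            with open(cache_path, "rb") as f:
                return pickle.load(f)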

    Creating 8 separate processes is another option. Each process can parse its own part of the file into a partial dict, and the processes can communicate via Pipe in the "multiprocessing" module, or via the "socket" module; see the sketch below.
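    A sketch of the multi-process variant using Pipe (splitting the file by byte offsets and the tab-separated format are assumptions): each worker parses its byte range into a partial dict and sends it back to the parent, which merges the parts.

        import os
        from multiprocessing import Process, Pipe

        def worker(path, start, end, conn):
            # Parse lines that *start* in [start, end) into a partial dict.
            part = {}
            with open(path, "rb") as f:
                if start > 0:
                    # Align to the next line boundary; the worker owning the
                    # previous chunk parses any line straddling `start`.
                    f.seek(start - 1)
                    if f.read(1) != b"\n":
                        f.readline()
                while f.tell() < end:
                    line = f.readline()
                    if not line:
                        break
                    key, value = line.rstrip(b"\r\n").split(b"\t", 1)
                    part[key] = value
            conn.send(part)  # the partial dict is pickled over the Pipe
            conn.close()

        def parallel_load(path, nprocs=8):
            size = os.path.getsize(path)
            chunk = size // nprocs
            procs, conns = [], []
            for i in range(nprocs):
                start = i * chunk
                end = size if i == nprocs - 1 else (i + 1) * chunk
                parent_conn, child_conn = Pipe()
                p = Process(target=worker, args=(path, start, end, child_conn))
                p.start()
                procs.append(p)
                conns.append(parent_conn)
            result = {}
            for conn in conns:
                result.update(conn.recv())  # merge partial dicts in the parent
            for p in procs:
                p.join()
            return result

        if __name__ == "__main__":
            d = parallel_load("big_file.txt")  # hypothetical filename

    Note that sending each partial dict back through the Pipe pickles it, so the speedup depends on the per-line parsing being more expensive than that serialization.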

    Best regards

    Barış ÇUHADAR.
