how can I share a dictionary across multiple processes?

旧时模样 submitted on 2019-12-11 04:18:06

Question


I was wondering if it's possible to share the contents of a dictionary across multiple processes. I've been looking at http://docs.python.org/2/library/multiprocessing.html#shared-ctypes-objects, but it only describes how to share individual variables; I haven't figured out how to share a complete dictionary. I know I could use pickle and share it by storing it to a file, but that doesn't seem very efficient, especially because I'm running this on a system with flash memory... any tips?

Thanks, Ron


Answer 1:


Use a Manager from the multiprocessing library. Its dict() method returns a proxy object that child processes can read and update, with the actual dictionary living in the manager's server process.

from multiprocessing import Process, Manager

def f(d):
    # each worker increments the shared counter 10000 times
    for i in range(10000):
        d['blah'] += 1

if __name__ == '__main__':
    manager = Manager()

    # manager.dict() returns a proxy shared by all child processes
    d = manager.dict()
    d['blah'] = 0
    procs = [Process(target=f, args=(d,)) for _ in range(10)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(d)
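
One caveat: d['blah'] += 1 is a read-modify-write through the proxy, so it is not atomic and concurrent increments can be lost. A minimal sketch of one way to guard it, using a Lock created by the same Manager (the lock parameter is an illustrative addition, not part of the original answer):

from multiprocessing import Process, Manager

def f(d, lock):
    for i in range(10000):
        # hold the shared lock around the read-modify-write so no increment is lost
        with lock:
            d['blah'] += 1

if __name__ == '__main__':
    manager = Manager()
    d = manager.dict()
    lock = manager.Lock()  # illustrative addition: a lock proxy shared via the same manager
    d['blah'] = 0
    procs = [Process(target=f, args=(d, lock)) for _ in range(10)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(d)  # with every increment guarded, the count should be 100000 (10 workers x 10000)

With the lock in place the final count should be the full 100000, at the cost of serializing the increments.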


Source: https://stackoverflow.com/questions/16992579/how-can-i-share-a-dictionary-across-multiple-processes
