Global variables and Python multiprocessing [duplicate]


Question


Possible Duplicate:
Python multiprocessing global variable updates not returned to parent

I am using a computer with many cores, and for performance I should really use more than one of them. However, I'm confused about why these bits of code don't do what I expect:

from multiprocessing import Process

var = range(5)
def test_func(i):
    global var
    var[i] += 1

if __name__ == '__main__':
    jobs = []
    for i in xrange(5):
        p = Process(target=test_func,args=(i,))
        jobs.append(p)
        p.start()

print var

As well as

from multiprocessing import Pool

var = range(5)
def test_func(i):
    global var
    var[i] += 1

if __name__ == '__main__':
    p = Pool()
    for i in xrange(5):
        p.apply_async(test_func,[i])

print var

I expect the result to be [1, 2, 3, 4, 5] but the result is [0, 1, 2, 3, 4].

There must be some subtlety I'm missing about using global variables with processes. Is this even the way to go, or should I avoid trying to change a variable in this manner?


Answer 1:


If you are running separate processes, they won't share the same globals: each child works on its own copy of the module's data, so changes it makes never appear back in the parent. If you want to pass data between processes, look at a Pipe (send/recv), a Queue, or the shared-memory types Value and Array. Take a look at http://docs.python.org/library/multiprocessing.html#sharing-state-between-processes for an example similar to what you're doing.
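As a concrete illustration of the shared-memory route the linked docs describe (this sketch is not part of the original answer), the first example could be rewritten with multiprocessing.Array so every process mutates the same buffer, and the parent joins the children before reading:

from multiprocessing import Process, Array

def test_func(i, shared):
    # Increment one slot of the shared array; the buffer lives in shared
    # memory, so the change is visible to the parent process as well.
    shared[i] += 1

if __name__ == '__main__':
    var = Array('i', range(5))        # shared array of C ints, initialised to 0..4
    jobs = [Process(target=test_func, args=(i, var)) for i in range(5)]
    for p in jobs:
        p.start()
    for p in jobs:
        p.join()                      # wait for every child before reading
    print(list(var))                  # prints [1, 2, 3, 4, 5]

Alternatively, with a Pool it is usually simpler to have the workers return results instead of mutating shared state, for example:

from multiprocessing import Pool

def test_func(i):
    return i + 1                      # return the result instead of mutating a global

if __name__ == '__main__':
    p = Pool()
    print(p.map(test_func, range(5)))  # prints [1, 2, 3, 4, 5]
    p.close()
    p.join()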



Source: https://stackoverflow.com/questions/11215554/globals-variables-and-python-multiprocessing
