Question
I'm trying to start 6 threads, each taking an item from the list files, removing it, and then printing the value.
from multiprocessing import Pool

files = ['a','b','c','d','e','f']

def convert(file):
    process_file = files.pop()
    print process_file

if __name__ == '__main__':
    pool = Pool(processes=6)
    pool.map(convert, range(6))
The expected output is:
a
b
c
d
e
f
Instead, the output is:
f
f
f
f
f
f
What's going on? Thanks in advance.
Answer 1:
Part of the issue is that you are not dealing with the multiprocess nature of Pool: each worker runs in its own process with its own copy of files, so every pop() removes the last element, 'f', from a private copy. (Note that in CPython, multithreading does not gain performance for CPU-bound code, due to the Global Interpreter Lock.)
Is there a reason why you need to alter the original list? Your current code does not use the iterable passed in, and instead edits a shared mutable object, which is DANGEROUS in the world of concurrency. A simple solution is as follows:
from multiprocessing import Pool

files = ['a','b','c','d','e','f']

def convert(aFile):
    print aFile

if __name__ == '__main__':
    pool = Pool()  # note the default will use the optimal number of workers
    pool.map(convert, files)
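If convert eventually needs to produce a result for each file, note that pool.map already collects return values for you, so there is no need to mutate shared state at all. A minimal sketch of that pattern (the '.converted' suffix is a made-up stand-in for real work):

from multiprocessing import Pool

files = ['a','b','c','d','e','f']

def convert(aFile):
    # hypothetical conversion: return a new value instead of mutating shared state
    return aFile + '.converted'

if __name__ == '__main__':
    pool = Pool()
    results = pool.map(convert, files)  # results come back in input order
    print(results)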
Your question really got me thinking, so I did a little more exploration to understand why Python behaves this way. What looks like black magic, deepcopying the object into the new process while maintaining its id, is really fork semantics: on Unix each worker process starts as a copy-on-write copy of the parent, so every child gets its own private copy of the list at the same virtual address, which is why id() reports the same value in every process. This can be seen by altering the number of processes used:
from multiprocessing import Pool

files = ['d','e','f','a','b','c',]
a = sorted(files)

def convert(_):
    print a == files
    files.sort()
    #print id(files)  # note this is the same for every process, which is interesting

if __name__ == '__main__':
    pool = Pool(processes=1)
    pool.map(convert, range(6))
==> all but the first invocation print 'True' as expected.
If you set the number of processes to 2, the result is less deterministic, as it depends on which process actually executes its statements first.
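If the original goal, six workers each removing one distinct item from a shared collection, is what you actually need, a multiprocessing.Queue does that safely, because get() hands each item to exactly one process. A minimal sketch (the worker function and the None sentinel convention are my own illustration, not part of the original code):

from multiprocessing import Process, Queue

def worker(q):
    while True:
        item = q.get()    # blocks until an item is available
        if item is None:  # sentinel: no more work left
            break
        print(item)

if __name__ == '__main__':
    files = ['a','b','c','d','e','f']
    q = Queue()
    for f in files:
        q.put(f)
    workers = [Process(target=worker, args=(q,)) for _ in range(6)]
    for _ in workers:
        q.put(None)       # one sentinel per worker
    for w in workers:
        w.start()
    for w in workers:
        w.join()

Each letter is printed exactly once, though the order is not guaranteed, since whichever process dequeues an item first wins.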
Answer 2:
One solution is to use multiprocessing.dummy, which uses threads instead of processes. Simply changing your import to:

from multiprocessing.dummy import Pool

"solves" the problem, but doesn't protect the shared memory against concurrent accesses. You should still use a threading.Lock, or a Queue with put and get, as in the sketch below.
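For instance, a minimal sketch of the thread-based variant with a Lock guarding the shared list (in CPython a single list.pop is already atomic under the GIL, but the explicit lock makes the intent clear and protects any multi-step update you add later):

from multiprocessing.dummy import Pool  # thread-based Pool with the same API
from threading import Lock

files = ['a','b','c','d','e','f']
lock = Lock()

def convert(_):
    with lock:  # serialize access to the shared list
        process_file = files.pop()
    print(process_file)

if __name__ == '__main__':
    pool = Pool(6)
    pool.map(convert, range(6))

Because threads share memory, each letter is printed exactly once here, though not necessarily in order.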
Source: https://stackoverflow.com/questions/8626107/use-python-pool-map-to-have-multiple-processes-perform-operations-on-a-list