Multiprocessing - Shared Array

小鲜肉 2021-01-05 19:26

So I'm trying to implement multiprocessing in Python, where I wish to have a Pool of 4-5 processes running a method in parallel. The purpose of this is to run a total of tho…

2 Answers
  •  轮回少年
    2021-01-05 19:42

    Since you're only returning state from the child process to the parent process, using a shared array and explicit locks is overkill. You can use Pool.map or Pool.starmap to accomplish exactly what you need. For example:

    from multiprocessing import Pool
    
    class Adder:
        """I'm using this class in place of a monte carlo simulator"""
    
        def add(self, a, b):
            return a + b
    
    def setup(x, y, z):
        """Sets up the worker processes of the pool. 
        Here, x, y, and z would be your global settings. They are only included
        as an example of how to pass args to setup. In this program they would
        be "some arg", "another" and 2
        """
        global adder
        adder = Adder()
    
    def job(a, b):
        """wrapper function to start the job in the child process"""
        return adder.add(a, b)
    
    if __name__ == "__main__":   
        args = list(zip(range(10), range(10, 20)))
        # args == [(0, 10), (1, 11), ..., (8, 18), (9, 19)]
    
        with Pool(processes=4, initializer=setup, initargs=["some arg", "another", 2]) as pool:
            # runs jobs in parallel and returns when all are complete
            results = pool.starmap(job, args)
    
        print(results) # prints [10, 12, ..., 26, 28] 
    
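    For comparison, here is a minimal sketch of the shared-array approach the question title mentions. It uses multiprocessing.Array to hold the results in shared memory; since each worker writes only to its own slot, no explicit lock is needed. The names `worker` and `main` are just illustrative:

    ```python
    from multiprocessing import Process, Array

    def worker(shared, i, a, b):
        # Each process writes only to its own slot i, so no lock is required.
        shared[i] = a + b

    def main():
        args = list(zip(range(10), range(10, 20)))
        # Typecode "d" = C double; the array lives in shared memory.
        shared = Array("d", len(args))
        procs = [Process(target=worker, args=(shared, i, a, b))
                 for i, (a, b) in enumerate(args)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        return list(shared)

    if __name__ == "__main__":
        print(main())  # same sums as the starmap version, as floats
    ```

    This produces the same results, but you have to manage the processes and the result buffer yourself, which is why Pool.starmap is the simpler choice here.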
