python-multiprocessing

Spawn parallel processes for a function and pass several different arguments to it

南楼画角 submitted on 2020-01-06 18:54:18
Question: Hi everybody, I took Jiaaro's solution as a template to convert it from threading to multiprocessing:

    import multiprocessing
    from function_repo import run
    from time import time

    vitems = ['02','63','25','0']
    num_processes = (multiprocessing.cpu_count()/1)
    threads = []

    if __name__ == '__main__':
        begin = time()
        print begin
        # run until all the threads are done, and there is no data left
        while threads or vitems:
            if( len(threads) < (num_processes -1) ):
                p = multiprocessing.Process(target=run,args=
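
The excerpt is cut off mid-statement, but a simpler way to fan items out to worker processes is multiprocessing.Pool. The following is a sketch, not the asker's code: function_repo isn't shown, so run below is a stand-in that takes one item and doubles it.

    import multiprocessing

    def run(item):
        # stand-in for function_repo.run: process one item
        return item * 2

    if __name__ == '__main__':
        vitems = ['02', '63', '25', '0']
        # leave one core free for the main process, as the original code intended
        num_processes = max(multiprocessing.cpu_count() - 1, 1)
        with multiprocessing.Pool(processes=num_processes) as pool:
            results = pool.map(run, vitems)
        print(results)  # ['0202', '6363', '2525', '00']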

How can I implement an `input` method in a Tkinter parent script, with the displayed prompt and return value being sent back to a child script?

二次信任 submitted on 2020-01-06 08:04:16
Question: I have two scripts:

Processor_child.py: its purpose is to perform a number of data analysis and cleaning operations. It must perform the same operations when run alone (without Tkinter_parent.py) as it does when packaged into a GUI with Tkinter_parent.py.

Tkinter_parent.py: its purpose is to provide a GUI for those who can't use Processor_child directly.

Where I'm struggling is reproducing the Python input function from Processor_child.py in the instance where these two are used together
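
One common pattern for this is to route the prompt through a pair of queues: the child sends its prompt text to the parent, the Tkinter side polls for it without blocking the event loop, shows a dialog, and the reply becomes the child's "input". This is a minimal sketch, not the asker's code; the queue names, prompt text, and dialog choice are all assumptions.

    import multiprocessing

    def child_main(prompt_q, reply_q):
        # stand-in for Processor_child: ask the parent instead of calling input()
        prompt_q.put("Enter a value: ")
        value = reply_q.get()          # blocks until the GUI answers
        print("child received:", value)
        prompt_q.put(None)             # signal: no more prompts

    if __name__ == '__main__':
        import tkinter as tk
        from tkinter import simpledialog

        prompt_q = multiprocessing.Queue()
        reply_q = multiprocessing.Queue()
        proc = multiprocessing.Process(target=child_main, args=(prompt_q, reply_q))
        proc.start()

        root = tk.Tk()
        root.withdraw()                 # no main window needed for this sketch

        def poll():
            if not prompt_q.empty():    # check without blocking the Tk event loop
                prompt = prompt_q.get()
                if prompt is None:      # child is done
                    root.quit()
                    return
                reply_q.put(simpledialog.askstring("Input", prompt))
            root.after(100, poll)

        root.after(100, poll)
        root.mainloop()
        proc.join()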

Requests module crashes Python when numpy is loaded and a forked process is used

ⅰ亾dé卋堺 submitted on 2020-01-05 13:03:18
Question: Strange title, I know, but it is exactly what I see. I am trying to run a requests (2.13.0) command from within a forked process (Mac OS X) using the multiprocessing module. I also happen to use numpy (1.15.1) in my code, running on Python 3.7. Here are my observations (see code below):

1) Without importing numpy: all works fine.
2) Once I import numpy: the code crashes on starting the forked process. The message given is: objc[45539]: +[__NSPlaceholderDate initialize] may have been in progress in
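
For context, this is a known interaction on macOS: fork() without exec() can abort inside Apple's Objective-C frameworks once certain libraries have touched them. A minimal sketch of the usual workaround is the 'spawn' start method (the URL below is a placeholder); setting the OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES environment variable before launching Python is another commonly cited option.

    import multiprocessing

    def fetch():
        import requests
        print(requests.get('https://example.com').status_code)

    if __name__ == '__main__':
        # 'spawn' starts a fresh interpreter for the child instead of
        # forking, sidestepping the Objective-C fork-safety abort
        multiprocessing.set_start_method('spawn')
        p = multiprocessing.Process(target=fetch)
        p.start()
        p.join()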

Parallelize a for loop in Python

北城以北 submitted on 2020-01-05 05:50:08
Question: I have a genetic algorithm which I would like to speed up. I'm thinking the easiest way to achieve this is with Python's multiprocessing module. After running cProfile on my GA, I found that most of the computational time is spent in the evaluation function:

    def evaluation():
        scores = []
        for chromosome in population:
            scores.append(costly_function(chromosome))

How would I go about parallelizing this method? It is important that all the scores are appended in the same order as they would be if the
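
A sketch of the Pool-based approach, assuming costly_function is picklable and takes one chromosome (the fitness body below is a stand-in): Pool.map distributes the calls across worker processes and, importantly for this question, returns the results in the same order as the input.

    from multiprocessing import Pool

    def costly_function(chromosome):
        # stand-in for the real fitness computation
        return sum(chromosome)

    def evaluation(population):
        # Pool.map preserves input order, no matter which worker finishes first
        with Pool() as pool:
            return pool.map(costly_function, population)

    if __name__ == '__main__':
        population = [[1, 2], [3, 4], [5, 6]]
        print(evaluation(population))  # [3, 7, 11]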

Python multiprocessing: object passed by value?

狂风中的少年 submitted on 2020-01-04 06:04:49
Question: I have been trying the following:

    from multiprocessing import Pool

    def f(some_list):
        some_list.append(4)
        print 'Child process: new list = ' + str(some_list)
        return True

    if __name__ == '__main__':
        my_list = [1, 2, 3]
        pool = Pool(processes=4)
        result = pool.apply_async(f, [my_list])
        result.get()
        print 'Parent process: new list = ' + str(my_list)

What I get is:

    Child process: new list = [1, 2, 3, 4]
    Parent process: new list = [1, 2, 3]

So it means that my_list was passed by value, since it
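
What actually happens is that the argument is pickled and sent to the worker process, so the child appends to its own copy. If shared, mutable state is really needed, one option is a Manager list; this is a sketch, with the caveat that every proxy operation costs a round-trip to the manager process.

    from multiprocessing import Pool, Manager

    def f(some_list):
        # the proxy forwards this append to the manager process,
        # so the parent sees the mutation too
        some_list.append(4)
        return True

    if __name__ == '__main__':
        with Manager() as manager:
            my_list = manager.list([1, 2, 3])
            with Pool(processes=4) as pool:
                pool.apply_async(f, (my_list,)).get()
            print('Parent process: new list =', list(my_list))  # [1, 2, 3, 4]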

Python multiprocessing/threading code exits early

蹲街弑〆低调 submitted on 2020-01-04 05:51:21
Question: I'm trying to create multiple processes which each start multiple threads. I'm running the following code with Python 3.5. A simplified example of the problem looks like this:

    import multiprocessing
    import time
    import threading

    class dumb(threading.Thread):
        def __init__(self):
            super(dumb, self).__init__()

        def run(self):
            while True:
                print("hi")
                time.sleep(1)

    def test():
        for i in range(2):
            bar = dumb()
            bar.start()

    def main():
        p = []
        for i in range(2):
            p.append(multiprocessing.Process(target=test))
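
The excerpt cuts off before the processes are started, so the exact cause can't be confirmed from what's shown; but a sketch of the usual fix for this shape of program is to join both layers, so neither the worker processes nor the parent return while the threads are still running:

    import multiprocessing
    import threading
    import time

    def worker():
        while True:
            print("hi")
            time.sleep(1)

    def test():
        threads = [threading.Thread(target=worker) for _ in range(2)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()   # keep this process alive while its threads run

    def main():
        procs = [multiprocessing.Process(target=test) for _ in range(2)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()   # keep the parent alive while the workers run

    if __name__ == '__main__':
        main()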

How to terminate multiprocessing workers in Python when a given condition is met? [duplicate]

两盒软妹~` submitted on 2020-01-03 02:24:08
Question: This question already has answers here: Terminate a Python multiprocessing program once one of its workers meets a certain condition (4 answers). Closed 2 years ago.

Let's say I have the function:

    def f():
        while True:
            x = generate_something()
            if x == condition:
                return x

    if __name__ == '__main__':
        p = Pool(4)

I want to run this function in multiple processes, and when one of the processes meets my function's condition, I want all the other processes to stop.

Answer 1: You can use an event and terminate in
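
The answer is truncated here, but a sketch of the event-based approach it names looks like this (the generator and the condition below are stand-ins). Note that a plain multiprocessing.Event can't be passed to Pool workers as an argument, so a Manager proxy is used instead.

    import multiprocessing
    import random

    def f(stop_event):
        while not stop_event.is_set():
            x = random.randint(0, 1000)   # stand-in for generate_something()
            if x == 42:                   # stand-in for the condition
                stop_event.set()          # tell the other workers to stop
                return x
        return None

    if __name__ == '__main__':
        with multiprocessing.Manager() as manager:
            # a Manager Event proxy is picklable, so Pool workers can receive it
            stop_event = manager.Event()
            with multiprocessing.Pool(4) as pool:
                results = pool.map(f, [stop_event] * 4)
        print([r for r in results if r is not None])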