python-multiprocessing

Parallelizing comparisons between two dataframes with multiprocessing

做~自己de王妃 submitted on 2021-02-10 15:57:04

Question: I've got the following function that lets me compare the rows of two dataframes (data and ref) and return the index of both rows if there's a match.

    def get_gene(row):
        m = (np.equal(row[0], ref.iloc[:, 0].values)
             & np.greater_equal(row[2], ref.iloc[:, 2].values)
             & np.less_equal(row[3], ref.iloc[:, 3].values))
        return ref.index[m] if m.any() else None

Since this is a slow process (25 min for 1.6M rows in data versus 20K rows in ref), I tried to speed things up by …
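
One common approach (a minimal sketch, not the asker's actual solution) is to distribute the rows of data across a multiprocessing.Pool, with ref defined at module level so every worker can see it. The small dataframes below are made up for illustration.

    import numpy as np
    import pandas as pd
    from multiprocessing import Pool

    # `ref` lives at module level so it is re-created in every worker process
    # (needed on platforms that use the "spawn" start method). The contents
    # here are made up for illustration.
    ref = pd.DataFrame({"chrom": ["1", "1", "2"],
                        "name":  ["a", "b", "c"],
                        "start": [100, 500, 100],
                        "end":   [200, 900, 300]})

    def get_gene(row):
        # `row` is a plain tuple: (chrom, name, start, end).
        m = (np.equal(row[0], ref.iloc[:, 0].values)
             & np.greater_equal(row[2], ref.iloc[:, 2].values)
             & np.less_equal(row[3], ref.iloc[:, 3].values))
        return ref.index[m] if m.any() else None

    if __name__ == "__main__":
        data = pd.DataFrame({"chrom": ["1", "2"],
                             "name":  ["x", "y"],
                             "start": [120, 150],
                             "end":   [180, 250]})
        rows = list(data.itertuples(index=False, name=None))
        # Pool.map splits the rows across worker processes; a chunksize keeps
        # the inter-process communication overhead down for large inputs.
        with Pool() as pool:
            matches = pool.map(get_gene, rows, chunksize=1000)
        print(matches)

Passing plain tuples (via itertuples) rather than pandas Series keeps the pickling cost per task small, which matters a lot for 1.6M rows.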

Updating matplotlib plot in tkinter gui via another process

吃可爱长大的小学妹 submitted on 2021-02-10 10:13:04

Question: I have an interface that I am writing using tkinter. Inside that interface, I have a matplotlib plot. In the background, zmq connects to a server to get data. I am using Python's multiprocessing module, with a Queue to share the data and a Process running all of the zmq communication. In the GUI, I can set up a button to update the plot with the following function:

    def addData(self):
        if not self.data.empty():
            print "Have Data!"
            self.hist.addQueue(self.data)
            self.hist.updatePlot()
        else: …
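
The usual pattern for this situation (a sketch under assumptions, not the asker's code) is to poll the Queue from the Tk event loop with after() rather than only from a button press. The producer below stands in for the zmq process and a Label stands in for the matplotlib canvas; all names are hypothetical.

    import multiprocessing as mp
    import random
    import time
    import tkinter as tk
    from queue import Empty

    def producer(queue):
        # Stands in for the zmq worker in the question: it just pushes random
        # numbers into the shared queue twice a second.
        while True:
            queue.put(random.random())
            time.sleep(0.5)

    class App:
        def __init__(self, master, queue):
            self.master = master
            self.queue = queue
            # A Label stands in for the matplotlib canvas to keep the sketch short.
            self.label = tk.Label(master, text="waiting for data...")
            self.label.pack()
            self.poll_queue()                  # start polling immediately

        def poll_queue(self):
            # Drain everything the worker has produced so far, then update the GUI.
            try:
                while True:
                    value = self.queue.get_nowait()
                    self.label.config(text=f"latest value: {value:.3f}")
            except Empty:
                pass
            # Re-schedule on the Tk event loop instead of blocking on the queue,
            # so the window stays responsive.
            self.master.after(200, self.poll_queue)

    if __name__ == "__main__":
        q = mp.Queue()
        mp.Process(target=producer, args=(q,), daemon=True).start()
        root = tk.Tk()
        App(root, q)
        root.mainloop()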

Tkinter is opening multiple GUI windows upon file selection with multiprocessing, when only one window should exist

强颜欢笑 submitted on 2021-02-10 08:16:01

Question: I have primary.py:

    from tkinter import *
    from tkinter.filedialog import askopenfilename
    from tkinter import ttk
    import multiprocessing as mp
    import other_script

    class GUI:
        def __init__(self, master):
            self.master = master

    def file_select():
        path = askopenfilename()
        if __name__ == '__main__':
            queue = mp.Queue()
            queue.put(path)
            import_ds_proc = mp.Process(target=other_script.dummy, args=(queue,))
            import_ds_proc.daemon = True
            import_ds_proc.start()

    # GUI
    root = Tk()
    my_gui = GUI(root)
    # Display …
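
The behaviour in the title is typical of the "spawn" start method used on Windows: the child process re-imports primary.py, and because root = Tk() runs at module level, every spawned process builds its own window. Below is a hedged sketch of the usual restructuring; it assumes other_script.dummy(queue) exists as in the question, and the button wiring is made up for illustration.

    # primary.py (restructured sketch)
    from tkinter import Tk, Button
    from tkinter.filedialog import askopenfilename
    import multiprocessing as mp

    import other_script      # assumed to provide a picklable function dummy(queue)

    def file_select(queue):
        path = askopenfilename()
        queue.put(path)
        proc = mp.Process(target=other_script.dummy, args=(queue,))
        proc.daemon = True
        proc.start()

    if __name__ == "__main__":
        # Everything that builds the window lives under this guard, so a child
        # process re-importing primary.py (which happens under the "spawn"
        # start method) does not create a second Tk root.
        queue = mp.Queue()
        root = Tk()
        Button(root, text="Select file",
               command=lambda: file_select(queue)).pack()
        root.mainloop()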

How to achieve true parallelism with threads in Python?

谁说我不能喝 submitted on 2021-02-08 13:43:12

Question: I'm learning about the threading library in Python. I don't understand how to run two threads in parallel. Here are my Python programs:

Program without threading (fibsimple.py):

    def fib(n):
        if n < 2:
            return n
        else:
            return fib(n-1) + fib(n-2)

    fib(35)
    fib(35)
    print "Done"

Running time:

    $ time python fibsimple.py
    Done

    real    0m7.935s
    user    0m7.922s
    sys     0m0.008s

Same program with threading (fibthread.py):

    from threading import Thread

    def fib(n):
        if n < 2:
            return n
        else:
            return fib(n-1) + fib(n-2)

    t1 = …
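
Background for this question: CPython's global interpreter lock (GIL) lets only one thread execute Python bytecode at a time, so CPU-bound work like this recursive fib never speeds up with threading; separate processes each get their own interpreter and their own GIL. A small Python 3 sketch contrasting the two (the structure is illustrative, not the asker's code):

    import time
    from multiprocessing import Process
    from threading import Thread

    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    def timed(label, workers):
        start = time.time()
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print(f"{label}: {time.time() - start:.2f}s")

    if __name__ == "__main__":
        # Threads: both calls share one interpreter, and the GIL serialises
        # them, so this takes roughly as long as calling fib(35) twice in a row.
        timed("threads  ", [Thread(target=fib, args=(35,)) for _ in range(2)])
        # Processes: each worker has its own interpreter and GIL, so the two
        # fib(35) calls can genuinely run in parallel on a multi-core machine.
        timed("processes", [Process(target=fib, args=(35,)) for _ in range(2)])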

Dump intermediate results of multiprocessing job to filesystem and continue with processing later on

北慕城南 submitted on 2021-02-08 06:23:11

Question: I have a job that uses the multiprocessing package and calls a function via

    resultList = pool.map(myFunction, myListOfInputParameters)

Each entry of the list of input parameters is independent of the others. This job will run for a couple of hours. For safety reasons, I would like to store the intermediate results at regular intervals, e.g. once an hour. How can I do this and still be able to continue processing when the job was aborted and I want to restart it based on …
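
One way to get incremental checkpoints (a sketch under assumptions, not a definitive answer): replace map with imap so results arrive one at a time, append each to a file, and skip already-finished inputs on restart. my_function, the inputs and the checkpoint file name below are placeholders for the question's myFunction / myListOfInputParameters.

    import json
    import os
    from multiprocessing import Pool

    CHECKPOINT = "results.jsonl"     # hypothetical checkpoint file

    def my_function(x):
        return x * x                 # stand-in for the real, long-running work

    def load_done():
        """Return {input: result} for everything finished in a previous run."""
        if not os.path.exists(CHECKPOINT):
            return {}
        with open(CHECKPOINT) as fh:
            records = [json.loads(line) for line in fh]
        return {rec["input"]: rec["result"] for rec in records}

    if __name__ == "__main__":
        inputs = list(range(100))
        done = load_done()
        todo = [x for x in inputs if x not in done]
        with Pool() as pool, open(CHECKPOINT, "a") as fh:
            # imap yields results one by one in input order as workers finish
            # them, unlike map, which only returns after the whole list is done.
            for x, result in zip(todo, pool.imap(my_function, todo)):
                fh.write(json.dumps({"input": x, "result": result}) + "\n")
                fh.flush()           # persist each result as soon as it arrives
        print(f"{len(done)} results reused, {len(todo)} newly computed")

Writing every result is simpler than a timed flush; a time-based check inside the loop could be added if per-result I/O is too costly.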

Multiprocessing slower than serial processing in Windows (but not in Linux)

时光毁灭记忆、已成空白 submitted on 2021-02-07 13:56:12

Question: I'm trying to parallelize a for loop to speed up my code, since the loop's processing operations are all independent. Following online tutorials, the standard multiprocessing library in Python seemed like a good starting point, and I've got it working for basic examples. However, for my actual use case, I find that parallel processing (using a dual-core machine) is actually a little (<5%) slower when run on Windows. Running the same code on Linux, however, results in a parallel processing speed-up of …
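
For context: on Windows, multiprocessing uses the "spawn" start method, which starts a fresh interpreter for every worker, re-imports the module and pickles every argument, whereas Linux's default "fork" makes workers cheap, so fine-grained tasks can end up slower than a plain loop on Windows. The workload below is made up purely to illustrate one common mitigation, batching tasks with chunksize.

    import time
    from multiprocessing import Pool

    def work(x):
        # A deliberately small task; real workloads need to be coarse enough
        # that the computation dominates the pickling/IPC overhead.
        return sum(i * i for i in range(x))

    if __name__ == "__main__":
        args = [5_000] * 10_000

        start = time.perf_counter()
        serial = [work(a) for a in args]
        print(f"serial:   {time.perf_counter() - start:.2f}s")

        start = time.perf_counter()
        with Pool(processes=2) as pool:
            # A generous chunksize ships the work in large batches, amortising
            # the inter-process communication cost that is especially high
            # under "spawn".
            parallel = pool.map(work, args, chunksize=500)
        print(f"parallel: {time.perf_counter() - start:.2f}s")

        assert serial == parallel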