python-multiprocessing

multiprocessing.Pool in a Jupyter notebook works on Linux but not Windows

让人想犯罪 __ submitted on 2019-12-03 01:56:47
I'm trying to run a few independent computations (though they read from the same data). My code works when I run it on Ubuntu, but not on Windows (Windows Server 2012 R2), where I get the error 'module' object has no attribute ... when I try to use multiprocessing.Pool (it appears in the kernel console, not as output in the notebook itself). I've already made the mistake of defining the function AFTER creating the pool, and I've since corrected it, so that's not the problem. This happens even on the simplest of examples:

    from multiprocessing import Pool

    def f(x):
        return x**2

    pool = Pool(4)
    for …
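
A likely cause on Windows is that multiprocessing spawns a fresh interpreter that must import the worker function by name; a function defined only in a notebook cell lives in __main__ and cannot be located by the child, which surfaces as the 'module' object has no attribute error. A minimal sketch of the usual workaround, assuming a hypothetical helper module named workers.py saved next to the notebook:

    # workers.py (hypothetical module the spawned children can import)
    def f(x):
        return x ** 2

    # notebook cell / main script
    from multiprocessing import Pool
    import workers

    if __name__ == '__main__':
        with Pool(4) as pool:
            print(pool.map(workers.f, range(10)))

On Linux the original code happens to work because fork copies the notebook's memory, including f, into the children; spawn does not.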

When should we call multiprocessing.Pool.join?

会有一股神秘感。 submitted on 2019-12-03 01:47:20
Question: I am using multiprocessing.Pool.imap_unordered as follows:

    from multiprocessing import Pool

    pool = Pool()
    for mapped_result in pool.imap_unordered(mapping_func, args_iter):
        ...  # do some additional processing on mapped_result

Do I need to call pool.close or pool.join after the for loop?

Answer 1: No, you don't, but it's probably a good idea if you aren't going to use the pool anymore. The reasons for calling pool.close or pool.join are well put by Tim Peters in this SO post: As to Pool.close(), you should …
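
For completeness, the cleanup order is close() first (no more work will be submitted) and then join() (wait for the workers to exit). A minimal sketch, with mapping_func and args_iter replaced by placeholders:

    from multiprocessing import Pool

    def mapping_func(x):              # stand-in for the real mapping function
        return x * x

    if __name__ == '__main__':
        args_iter = range(10)
        pool = Pool()
        try:
            for mapped_result in pool.imap_unordered(mapping_func, args_iter):
                pass                  # additional processing on mapped_result goes here
        finally:
            pool.close()              # tell the pool no more tasks are coming
            pool.join()               # block until every worker process has exited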

Tkinter application 'freezes' while continually polling Pipe for contents (multiprocessing)

落爺英雄遲暮 submitted on 2019-12-02 21:52:37
Question: I have two scripts. Processor_child.py: its purpose is to perform a number of data analysis and cleaning operations. It must perform the same operations when run alone (without Tkinter_parent.py) as it does when packaged into a GUI with Tkinter_parent.py. Tkinter_parent.py: its purpose is to provide a GUI for those who can't use Processor_child directly. Within Processor_child, there are for loops that ask the user for input on each iteration. These prompts need to appear in the Tkinter …
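
The freeze usually comes from calling a blocking Pipe.recv() on the Tk main thread. A common alternative is to poll the pipe with Tkinter's after() so the event loop keeps running; a minimal sketch under that assumption (the widgets and messages below are illustrative, not the asker's actual code):

    import multiprocessing as mp
    import tkinter as tk

    def worker(conn):
        # child process: send a prompt, wait for the user's reply
        conn.send("How many rows should I keep?")
        answer = conn.recv()
        conn.send("Child received: {}".format(answer))

    def poll_pipe():
        # runs inside the Tk event loop and never blocks
        if parent_conn.poll():
            label.config(text=parent_conn.recv())
        root.after(100, poll_pipe)    # check again in 100 ms

    def send_answer():
        parent_conn.send(entry.get())

    if __name__ == '__main__':
        parent_conn, child_conn = mp.Pipe()
        mp.Process(target=worker, args=(child_conn,), daemon=True).start()

        root = tk.Tk()
        label = tk.Label(root, text="waiting for the child process...")
        label.pack()
        entry = tk.Entry(root)
        entry.pack()
        tk.Button(root, text="Send", command=send_answer).pack()

        root.after(100, poll_pipe)
        root.mainloop()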

Parallelizing four nested loops in Python

与世无争的帅哥 submitted on 2019-12-02 21:16:53
I have a fairly straightforward nested for loop that iterates over four arrays:

    for a in a_grid:
        for b in b_grid:
            for c in c_grid:
                for d in d_grid:
                    do_some_stuff(a, b, c, d)  # perform calculations and write to file

Maybe this isn't the most efficient way to perform calculations over a 4D grid to begin with. I know joblib is capable of parallelizing two nested for loops like this, but I'm having trouble generalizing it to four nested loops. Any ideas?

Answer 1: I usually use code of this form:

    #!/usr/bin/env python3
    import itertools
    import multiprocessing

    # Generate values for each parameter
    a = range(10)
    b …
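
The usual generalization is to flatten the four loops into a single iterable of (a, b, c, d) tuples with itertools.product and hand that to a pool. A sketch along those lines, with do_some_stuff and the grids reduced to placeholders:

    import itertools
    import multiprocessing

    def do_some_stuff(a, b, c, d):
        # placeholder for the real calculation / file writing
        return a + b + c + d

    if __name__ == '__main__':
        a_grid = b_grid = c_grid = d_grid = range(10)
        params = itertools.product(a_grid, b_grid, c_grid, d_grid)
        with multiprocessing.Pool() as pool:
            results = pool.starmap(do_some_stuff, params)  # one task per (a, b, c, d)
        print(len(results))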

How to share a synchronised list of classes between processes in Python multiprocessing

坚强是说给别人听的谎言 submitted on 2019-12-02 16:02:56
Question: How do I share state and synchronize a list of class instances when using multiprocessing? I have a crude example below where I am increasing the amount in three people's wallets. The three people all have different starting amounts. I iterate 10 times and give each an extra 10 each time.

    import concurrent.futures
    import multiprocessing as mp
    from multiprocessing import Manager
    from ctypes import py_object

    class Wallet():
        def __init__(self, name, amount):
            self.name = name
            self.amount = amount

    def give(num): …
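
A sketch of the usual Manager-based approach. Note that a manager list hands back copies of its items, so an object pulled out of the list has to be written back for the change to be visible to other processes; the worker signature and amounts below are assumptions, not the asker's code:

    import multiprocessing as mp
    from multiprocessing import Manager

    class Wallet:
        def __init__(self, name, amount):
            self.name = name
            self.amount = amount

    def give(args):
        wallets, index, extra, lock = args
        with lock:
            w = wallets[index]     # a copy comes back from the manager
            w.amount += extra
            wallets[index] = w     # reassign the slot so the update propagates

    if __name__ == '__main__':
        with Manager() as manager:
            wallets = manager.list([Wallet('anna', 0), Wallet('bob', 50), Wallet('cat', 100)])
            lock = manager.Lock()
            jobs = [(wallets, i % 3, 10, lock) for i in range(30)]
            with mp.Pool(3) as pool:
                pool.map(give, jobs)
            for w in wallets:
                print(w.name, w.amount)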

multiprocess plotting in matplotlib

自闭症网瘾萝莉.ら submitted on 2019-12-02 12:35:10
Question: How can one visualize data with matplotlib from a function running in parallel? I.e., I want to create figures in parallel processes and then display them in the main process. Here is an example:

    # input data
    import pandas as pd, matplotlib.pyplot as plt
    df = pd.DataFrame(data={'i': ['A', 'A', 'B', 'B'], 'x': [1., 2., 3., 4.], 'y': [1., 2., 3., 4.]})
    df.set_index('i', inplace=True)
    df.sort_index(inplace=True)

    # function which creates a figure from the data
    def Draw(df, i):
        fig = plt.figure(i)
        ax = fig.gca()
        df = …
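
Matplotlib figures are tied to the process (and GUI backend) that created them and do not pickle, so they cannot simply be sent back through a pool. A common workaround is to render to image files in the workers with the non-interactive Agg backend and only display in the main process; a sketch under that assumption, reusing the DataFrame layout from the excerpt:

    import multiprocessing as mp
    import pandas as pd

    def draw(args):
        i, sub = args
        import matplotlib
        matplotlib.use('Agg')             # non-interactive backend, safe in a child process
        import matplotlib.pyplot as plt
        fig, ax = plt.subplots()
        sub.plot(x='x', y='y', ax=ax, title=str(i))
        fname = 'figure_{}.png'.format(i)
        fig.savefig(fname)
        plt.close(fig)
        return fname

    if __name__ == '__main__':
        df = pd.DataFrame({'i': ['A', 'A', 'B', 'B'],
                           'x': [1., 2., 3., 4.],
                           'y': [1., 2., 3., 4.]}).set_index('i').sort_index()
        groups = [(i, df.loc[[i]]) for i in df.index.unique()]
        with mp.Pool(2) as pool:
            print(pool.map(draw, groups))  # paths of the rendered images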

Tkinter application 'freezes' while continually polling Pipe for contents (multiprocessing)

ぃ、小莉子 submitted on 2019-12-02 12:17:31
I have two scripts. Processor_child.py: its purpose is to perform a number of data analysis and cleaning operations. It must perform the same operations when run alone (without Tkinter_parent.py) as it does when packaged into a GUI with Tkinter_parent.py. Tkinter_parent.py: its purpose is to provide a GUI for those who can't use Processor_child directly. Within Processor_child, there are for loops that ask the user for input on each iteration. These prompts need to appear in the Tkinter app, accept the input, and send it back to Processor_child. The code below does this, raising an Entry …

Missing lines when writing file with multiprocessing Lock Python

て烟熏妆下的殇ゞ submitted on 2019-12-02 11:53:33
Question: This is my code:

    from multiprocessing import Pool, Lock
    from datetime import datetime as dt

    console_out = "/STDOUT/Console.out"
    chunksize = 50
    lock = Lock()

    def writer(message):
        lock.acquire()
        with open(console_out, 'a') as out:
            out.write(message)
            out.flush()
        lock.release()

    def conf_wrapper(state):
        import ProcessingModule as procs
        import sqlalchemy as sal
        stcd, nrows = state
        engine = sal.create_engine('postgresql://foo:bar@localhost:5432/schema')
        writer("State {s} started at: {n}" "\n".format …
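
A frequent cause of dropped lines with this pattern is that, under the spawn start method (the Windows default), each pool worker re-imports the module and so gets its own Lock, meaning the writes are not actually serialized. One standard remedy is to create a single Lock in the parent and hand it to every worker through the pool initializer; a sketch of that pattern, with conf_wrapper trimmed to a stub and the output path simplified:

    from multiprocessing import Pool, Lock

    console_out = "Console.out"     # the question uses "/STDOUT/Console.out"
    lock = None                     # set in each worker by init_worker

    def init_worker(shared_lock):
        global lock
        lock = shared_lock          # every worker now holds the same lock

    def writer(message):
        with lock:                  # released even if the write raises
            with open(console_out, 'a') as out:
                out.write(message)
                out.flush()

    def conf_wrapper(state):
        writer("State {0} started\n".format(state))

    if __name__ == '__main__':
        shared = Lock()
        with Pool(4, initializer=init_worker, initargs=(shared,)) as pool:
            pool.map(conf_wrapper, range(8))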

How to read serial data with multiprocessing in Python?

ε祈祈猫儿з submitted on 2019-12-02 11:24:12
I have a device that outputs data at irregular intervals. I want to write the data to a CSV in 2-second intervals, so I figured multiprocessing with a queue might work. Here I'm trying to just pass data from one process to another, but I get a SerialException. Also, I'm unable to run it from IDLE, so I'm stuck with using the terminal; as a result, the error message closes as soon as it opens. Here's the code:

    import multiprocessing
    import time
    import datetime
    import serial

    try:
        fio2_ser = serial.Serial("COM3", baudrate=2400, bytesize=serial.EIGHTBITS, parity=serial.PARITY_ODD)
    except serial …
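
A sketch of the producer/consumer split usually suggested for this: open the serial port inside the reader process (opening it at module level means every spawned process tries to grab the port), push readings onto a multiprocessing.Queue, and drain the queue to CSV every two seconds. The port name, framing, and file name below are assumptions:

    import csv
    import datetime
    import multiprocessing
    import time

    import serial   # pyserial

    def read_serial(q):
        # open the port here, in the child, so only one process owns it
        ser = serial.Serial("COM3", baudrate=2400, bytesize=serial.EIGHTBITS,
                            parity=serial.PARITY_ODD)
        while True:
            line = ser.readline().decode(errors='replace').strip()
            if line:
                q.put((datetime.datetime.now().isoformat(), line))

    def write_csv(q):
        with open('readings.csv', 'a', newline='') as f:
            writer = csv.writer(f)
            while True:
                time.sleep(2)                 # flush accumulated readings every 2 seconds
                while not q.empty():
                    writer.writerow(q.get())
                f.flush()

    if __name__ == '__main__':
        q = multiprocessing.Queue()
        multiprocessing.Process(target=read_serial, args=(q,), daemon=True).start()
        write_csv(q)   # keep the writer in the main process so Ctrl+C ends everything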

Does pydispatcher run the handler function in a background thread?

醉酒当歌 submitted on 2019-12-02 10:13:08
While looking up event-handler modules, I came across pydispatcher, which seemed beginner-friendly. My use case is that I want to send a signal when my queue size goes over a threshold. The handler function can then start processing and removing items from the queue (and subsequently do a bulk insert into the database). I would like the handler function to run in the background. I am aware that I could simply override the queue.append() method to check the queue size and call the handler function asynchronously, but I would like to implement the listener-dispatcher model to …
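
PyDispatcher itself calls receivers synchronously, in the thread that invokes dispatcher.send(), so "running in the background" has to be arranged by the receiver itself, for example by handing the work to a thread. A minimal sketch; the signal name, threshold, and bulk_insert stub are made up for illustration:

    import threading
    from pydispatch import dispatcher   # provided by the PyDispatcher package

    QUEUE_FULL = 'queue-over-threshold'

    def bulk_insert(items):
        print('inserting {} items'.format(len(items)))   # stand-in for the real DB call

    def drain_queue(sender, items):
        # hand off to a thread so dispatcher.send() returns immediately
        threading.Thread(target=bulk_insert, args=(items,)).start()

    dispatcher.connect(drain_queue, signal=QUEUE_FULL, sender=dispatcher.Any)

    # wherever the queue is appended to:
    queue = list(range(150))
    THRESHOLD = 100
    if len(queue) > THRESHOLD:
        dispatcher.send(signal=QUEUE_FULL, sender='queue-monitor', items=list(queue))
        queue.clear()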