python-multiprocessing

Python multiprocessing error with class methods

Submitted by 梦想的初衷 on 2019-12-12 02:46:57
Question: I am writing a program with object-oriented code in which I am trying to do multiprocessing. I was getting pickle errors because, by default, Python can serialize functions but not class methods. So I used the suggestion from Can't pickle <type 'instancemethod'> when using python's multiprocessing Pool.map(), but the problem is that if I have some lambda expressions inside my methods it does not work. My sample code is as follows: import numpy as np from copy_reg import pickle from types import
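A minimal sketch of the usual workaround, assuming Python 3 (where bound methods pickle by default but lambdas still do not): replace the lambda with a module-level function so Pool.map can pickle it. The _scale helper and Model class below are hypothetical, not the asker's code.

import multiprocessing as mp
import numpy as np

def _scale(row, factor=2.0):
    # Module-level function: picklable, unlike a lambda defined inside a method.
    return row * factor

class Model:
    def __init__(self, data):
        self.data = data

    def run_parallel(self):
        with mp.Pool(processes=4) as pool:
            # pool.map(lambda r: r * 2.0, self.data)  # would fail: lambdas can't be pickled
            return pool.map(_scale, self.data)

if __name__ == '__main__':
    m = Model([np.arange(3), np.arange(3, 6)])
    print(m.run_parallel())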

multiprocessing in python jump over process with no error

Submitted by 本小妞迷上赌 on 2019-12-12 02:16:41
Question: I have the following code, which behaves rather strangely. class A: def __init__(self): self.lock = Lock() self.process_list = [] self.event_list = [] def run(self): self.process_list = [] counter = 0 n = 0 while (n<1000): n += 1 print("in while again") self.lock.acquire() print('after acquired lock') self.lock.release() self.event_list.append(Event()) print('add event') p = Process(target=workerEmulator().run, args=(self.lock, self.event_list[counter])) print('create process') self.process_list
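For reference, a minimal runnable sketch of the same pattern the question uses - passing a Lock and an Event into each Process and joining them - with a module-level worker standing in for the truncated workerEmulator:

from multiprocessing import Process, Lock, Event
import time

def worker(lock, event):
    with lock:
        print('worker acquired lock')
    time.sleep(0.1)
    event.set()  # signal the parent that this worker finished

if __name__ == '__main__':
    lock = Lock()
    events, procs = [], []
    for _ in range(4):
        ev = Event()
        p = Process(target=worker, args=(lock, ev))
        p.start()
        events.append(ev)
        procs.append(p)
    for p in procs:
        p.join()
    print('all events set:', all(ev.is_set() for ev in events))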

How to use Python multiprocessing to prepare images for pygame

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-12 00:45:24
Question: I'm making a slideshow app with that oh-so-noughties pan and zoom effect. I'm using pygame. The main display is therefore real-time at 30fps+, and I don't want it stuttering when it has to load a new image - this takes over 1/30th of a second. So I wanted to use some parallel process to prepare the images and feed the main process with these objects, which are instances of a class. I've tried with threading and with multiprocessing. Threading 'works' but it's still jumpy (I blame Python) - the
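One common shape for this is a producer process that loads the images and a non-blocking queue read in the render loop; pygame Surfaces cannot be pickled, so raw pixel bytes plus the size are sent instead. A sketch under those assumptions (the image paths are hypothetical):

import multiprocessing as mp
import queue

def prepare_images(paths, out_q):
    # Worker process: do the slow loading off the render process.
    import pygame
    for path in paths:
        surf = pygame.image.load(path)
        out_q.put((pygame.image.tostring(surf, 'RGB'), surf.get_size()))

if __name__ == '__main__':
    import pygame
    pygame.init()
    screen = pygame.display.set_mode((800, 600))
    q = mp.Queue(maxsize=2)
    mp.Process(target=prepare_images, args=(['a.jpg', 'b.jpg'], q), daemon=True).start()

    clock = pygame.time.Clock()
    current, running = None, True
    while running:
        for e in pygame.event.get():
            if e.type == pygame.QUIT:
                running = False
        try:
            raw, size = q.get_nowait()  # never block the 30fps render loop
            current = pygame.image.fromstring(raw, size, 'RGB')
        except queue.Empty:
            pass
        if current is not None:
            screen.blit(current, (0, 0))
        pygame.display.flip()
        clock.tick(30)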

Python multiprocessing.pool sequential run of processes

Submitted by 五迷三道 on 2019-12-11 23:06:53
Question: I am pretty new to Python and the 'multiprocessing' module in particular. However, I have managed to write a very simple script to run multiple processes (say 100) on 24 CPUs. However, I have noticed that the processes are not run sequentially but in an apparently random order. Is there a way for the processes to be run sequentially? Here is my code: #!/usr/bin/env python import multiprocessing import subprocess def prcss(cmd): sbc = subprocess.call com = sbc(cmd, shell='True') return (com) if __name__=='_
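A sketch of one way to keep the submission order while still using all 24 CPUs: pool.map submits tasks in list order and returns results in that same order, even though the workers run them concurrently. The echo commands below are placeholders:

#!/usr/bin/env python
import multiprocessing
import subprocess

def prcss(cmd):
    # Each task runs one shell command; the return code goes back to the parent.
    return subprocess.call(cmd, shell=True)

if __name__ == '__main__':
    cmds = ['echo job %d' % i for i in range(100)]
    pool = multiprocessing.Pool(processes=24)
    results = pool.map(prcss, cmds)  # results come back in submission order
    pool.close()
    pool.join()
    print(results)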

Do I need to call pool.terminate manually upon exception in multiprocessing?

Submitted by 一笑奈何 on 2019-12-11 20:15:39
Question: It seems the following two snippets have the same behavior: def sqr(a): time.sleep(1.2) print 'local {}'.format(os.getpid()) if a == 20: raise Exception('fff') return a * a pool = Pool(processes=4) A: try: r = [pool.apply_async(sqr, (x,)) for x in range(100)] pool.close() for item in r: item.get(timeout=999999) except: pool.terminate() raise finally: pool.join() print 'main {}'.format(os.getpid()) B: r = [pool.apply_async(sqr, (x,)) for x in range(100)] pool.close() for item in r: item.get
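For comparison, a Python 3 sketch of the same experiment using the pool as a context manager, whose __exit__ calls terminate() so the cleanup no longer has to be written by hand; the range and timeout values are arbitrary:

from multiprocessing import Pool
import os
import time

def sqr(a):
    time.sleep(0.1)
    if a == 20:
        raise Exception('fff')
    return a * a

if __name__ == '__main__':
    try:
        with Pool(processes=4) as pool:  # __exit__ calls pool.terminate()
            results = [pool.apply_async(sqr, (x,)) for x in range(30)]
            for r in results:
                r.get(timeout=60)  # re-raises the worker's exception in the parent
    except Exception as exc:
        print('caught in parent:', exc)
    print('main {}'.format(os.getpid()))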

PyMongo’s bulk write operation features with multiprocessing and generators

Submitted by 。_饼干妹妹 on 2019-12-11 19:57:01
Question: PyMongo supports generators for batch processing with sDB.insert(iter_something(converted)). Bulk write operations execute writes in batches in order to reduce the number of network round trips and increase write throughput. The following code seems to work, but I do not know whether PyMongo, used together with multiprocessing, is still able to iterate the generator until it has yielded 1000 documents or 16 MB of data, then pause the generator while it inserts the batch into
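A rough sketch of the batching idea with an explicit chunk size, using insert_many rather than the older insert API; iter_something and the collection names below are hypothetical stand-ins:

from itertools import islice
from pymongo import MongoClient

def batches(iterable, size=1000):
    # Pull at most `size` documents from the generator at a time,
    # so only one batch is held in memory per insert.
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            break
        yield chunk

def iter_something(converted):
    for item in converted:  # hypothetical source generator
        yield {'value': item}

if __name__ == '__main__':
    coll = MongoClient().test_db.test_collection
    for chunk in batches(iter_something(range(10000))):
        coll.insert_many(chunk, ordered=False)  # one round trip per batch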

Right way to share opencv video frame as Numpy array between multiprocessing processes

Submitted by 早过忘川 on 2019-12-11 17:33:56
Question: I want to share my captured frame in OpenCV with my multiprocessing subprocess, but video_capture.read() creates a new object and doesn't write into my NumPy array that I am going to share by wrapping it with multiprocessing.Array(). Here is the code: ret, frame = video_capture.read() shared_array = mp.Array(ctypes.c_uint16, frame.shape[0] * frame.shape[1], lock=False) while True: b = np.frombuffer(shared_array) ret, b = video_capture.read() But the buffer b gets overwritten by the read()
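A sketch of one fix on the writer side, under the assumption that the frame is 8-bit BGR: size the shared buffer for the full frame and copy into it with slice assignment instead of rebinding the local name:

import ctypes
import multiprocessing as mp
import numpy as np
import cv2

if __name__ == '__main__':
    video_capture = cv2.VideoCapture(0)
    ret, frame = video_capture.read()

    # Size the shared memory for height * width * channels uint8 values.
    shared_array = mp.Array(ctypes.c_uint8, frame.size, lock=False)
    shared_frame = np.frombuffer(shared_array, dtype=np.uint8).reshape(frame.shape)

    while True:
        ret, frame = video_capture.read()
        if not ret:
            break
        shared_frame[:] = frame  # copy into shared memory; do not rebind the name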

Python Multiprocessing Start Process in Module Other Than Main

Submitted by 久未见 on 2019-12-11 15:54:58
Question: I have three modules, worker, master, and MainTests. I'm running the MainTests module as the main script. In MainTests, I call master.run(), inside of which I need to spawn multiple worker processes. Is this possible? In all the Python multiprocessing tutorials I have come across, processes are started in the main module. If this is possible, could someone provide an example of what this might look like? This is what I have attempted so far: Worker.py import time class Worker(object):
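A minimal three-file sketch of one way this can work; only the entry script needs the __main__ guard, and processes may be started from any importable module. Module names are lowercased here and the worker body is hypothetical:

# worker.py
import time

class Worker(object):
    def run(self, n):
        time.sleep(0.1)
        print('worker processed', n)

# master.py
from multiprocessing import Process
from worker import Worker

def run():
    procs = [Process(target=Worker().run, args=(i,)) for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

# main_tests.py  (run this file)
import master

if __name__ == '__main__':
    master.run()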

Multiprocessing Running Slower than a Single Process

Submitted by 半腔热情 on 2019-12-11 15:19:31
Question: I'm attempting to use multiprocessing to run many simulations across multiple processes; however, as far as I can tell the code I have written only uses one of the processes. Update: I've gotten all the processes to work (I think) thanks to @PaulBecotte; however, the multiprocessing version seems to run significantly slower than its non-multiprocessing counterpart. For instance, not including the function and class declarations/implementations and imports, I have: def monty_hall_sim(num_trial, player
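A sketch of the usual remedy when multiprocessing runs slower than a single process: give each worker a large chunk of trials instead of one task per trial, so the process start-up and pickling overhead is amortised. The chunked simulator below is a hypothetical replacement for monty_hall_sim:

import multiprocessing as mp
import random

def monty_hall_chunk(num_trials):
    # Run a whole block of trials per task so the inter-process overhead
    # is paid once per chunk rather than once per trial.
    wins = 0
    for _ in range(num_trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        wins += (car != pick)  # switching wins whenever the first pick was wrong
    return wins

if __name__ == '__main__':
    total, workers = 1000000, 4
    with mp.Pool(workers) as pool:
        wins = sum(pool.map(monty_hall_chunk, [total // workers] * workers))
    print('switch win rate: %.3f' % (wins / float(total)))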

Aggregate several AxesSubplot after multiprocessing to draw a matplotlib figure

Submitted by ↘锁芯ラ on 2019-12-11 15:13:51
Question: I'm trying to create a subplot of several pcolormesh graphs that would look like this one: I manage to do this by computing on a single process: import matplotlib.pyplot as plt import numpy as np def create_spectrogram(data, x, y, ax): ax.pcolormesh(x, y, data) def do_it_simple(): size = 10 data = np.arange(size * size).reshape((size, size)) y = np.arange(0, 10) x = np.arange(0, 10) fig, axes = plt.subplots(nrows=2, ncols=5) axes_list = [item for sublist in axes for item in sublist] for ax in axes
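Matplotlib Axes cannot be sent back from worker processes, so the usual approach is to compute the arrays in a pool and do all plotting in the parent. A sketch along those lines (compute_data is a hypothetical stand-in for the real per-panel computation; shading='auto' needs matplotlib >= 3.3):

import multiprocessing as mp
import matplotlib.pyplot as plt
import numpy as np

def compute_data(idx, size=10):
    # Workers return plain NumPy arrays; no matplotlib objects cross processes.
    return np.arange(size * size).reshape((size, size)) + idx

if __name__ == '__main__':
    size = 10
    x = np.arange(size)
    y = np.arange(size)
    with mp.Pool(4) as pool:
        results = pool.map(compute_data, range(10))

    fig, axes = plt.subplots(nrows=2, ncols=5)
    for ax, data in zip(axes.flat, results):
        ax.pcolormesh(x, y, data, shading='auto')
    plt.show()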