python-multiprocessing

How can I get user input in a thread without an EOFError occurring in Python?

Submitted by 一世执手 on 2021-02-20 03:51:21
Question: I am trying to receive and send data at the same time, and my idea for doing this was:

    import multiprocessing
    import time
    import random
    from reprint import output

    def receiveThread(queue):
        while True:
            queue.put(random.randint(0, 50))
            time.sleep(0.5)

    def sendThread(queue):
        while True:
            queue.put(input())

    if __name__ == "__main__":
        send_queue = multiprocessing.Queue()
        receive_queue = multiprocessing.Queue()
        send_thread = multiprocessing.Process(target=sendThread, args=[send_queue],) ...
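The excerpt cuts off here, but the core issue is that calling input() inside a multiprocessing child usually raises EOFError, because the child's stdin is closed or detached. A minimal sketch of one common workaround, assuming the goal from the excerpt: keep input() in the parent (where stdin is still attached to the terminal) and let the child only produce data. The names receiver/worker and the termination handling are illustrative, not from the original post.

    import multiprocessing
    import random
    import time

    def receiver(queue):
        # Child process: generate data and push it to the parent.
        while True:
            queue.put(random.randint(0, 50))
            time.sleep(0.5)

    if __name__ == "__main__":
        receive_queue = multiprocessing.Queue()
        send_queue = multiprocessing.Queue()

        worker = multiprocessing.Process(target=receiver, args=(receive_queue,), daemon=True)
        worker.start()

        # Read stdin in the parent and forward each line to the child side.
        try:
            while True:
                send_queue.put(input())
        except (EOFError, KeyboardInterrupt):
            worker.terminate()

This only demonstrates the stdin side; consuming receive_queue while blocking on input() would need a separate thread or a select-style loop.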

Share the list of lists in multiprocessing

Submitted by 。_饼干妹妹 on 2021-02-19 07:45:39
Question: I want to increase the efficiency of my code. One intensive part of it is appending elements to a list of lists. Basically, I want to do something like the following:

    import multiprocessing
    import time

    def update_val(L, i):
        L.append(i**2)
        return L

    if __name__ == "__main__":
        N = 1000000
        x_reg = [list(range(10)) for i in range(N)]
        y_reg = [list(range(10)) for i in range(N)]
        z_reg = [list(range(10)) for i in range(N)]

        "Regular Call"
        start = time.time()
        [x_reg[i].append(i**2) for i in range(N)]
        stat...
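The excerpt breaks off, but the underlying question is how to parallelize the per-list appends. One hedged sketch: since plain Python lists are not shared between processes, each worker has to return its updated sublist and the parent reassembles them with Pool.map. The chunk size and argument packing below are assumptions for illustration, not from the original post.

    import multiprocessing
    import time

    def update_val(args):
        # Append i**2 to the given sublist and return it; the child works on
        # its own copy, so the result must be sent back to the parent.
        sublist, i = args
        sublist.append(i**2)
        return sublist

    if __name__ == "__main__":
        N = 100_000
        x_par = [list(range(10)) for _ in range(N)]

        start = time.time()
        with multiprocessing.Pool() as pool:
            x_par = pool.map(update_val, zip(x_par, range(N)), chunksize=1000)
        print("Parallel call:", time.time() - start)

For a workload this small per item, the pickling cost of shipping every sublist to a worker and back usually outweighs the work itself, so the serial loop tends to win; the sketch mainly shows the return-and-reassemble pattern.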

multiprocessing queue issue with pickle dumps

Submitted by 落花浮王杯 on 2021-02-18 18:02:22
Question: I have read and re-read the Python documentation on the multiprocessing module and queue management, but I cannot find anything related to this issue, which is driving me crazy and blocking my project. I wrote a 'JsonLike' class that lets me create an object such as:

    a = JsonLike()
    a.john.doe.is.here = True

...without having to initialize the intermediate attributes (very useful). The following code simply creates such an object, sets it, inserts it into an array, and tries to send that to a process (this...
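The snippet is cut off before the error, but a class that auto-creates attributes in __getattr__ commonly breaks pickling (and therefore multiprocessing queues): pickle probes the object for special methods such as __getstate__, and an unguarded __getattr__ returns a new child object instead of raising AttributeError. A minimal sketch of that idea; this JsonLike implementation is an assumption, not the asker's original class.

    import pickle

    class JsonLike(dict):
        """Dict whose missing attributes auto-create nested JsonLike children."""

        def __getattr__(self, name):
            # Refuse dunder lookups so pickle/copy introspection behaves normally.
            if name.startswith("__") and name.endswith("__"):
                raise AttributeError(name)
            return self.setdefault(name, JsonLike())

        def __setattr__(self, name, value):
            self[name] = value

    a = JsonLike()
    a.john.doe.here = True
    print(pickle.loads(pickle.dumps(a)))   # {'john': {'doe': {'here': True}}}

Once pickle.dumps succeeds, putting the object on a multiprocessing.Queue should work as well, since the queue serializes items with pickle.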

Python multiprocessing - Capturing signals to restart child processes or shut down parent process

Submitted by 断了今生、忘了曾经 on 2021-02-18 10:49:07
Question: I am using the multiprocessing library to spawn two child processes. I would like to ensure that, as long as the parent process is alive, the child processes are restarted automatically if they die (receive a SIGKILL or SIGTERM). On the other hand, if the parent process receives a SIGTERM/SIGINT, I want it to terminate all child processes and then exit. This is how I approached the problem:

    import sys
    import time
    from signal import signal, SIGINT, SIGTERM, SIGQUIT, SIGCHLD, SIG_IGN
    from ...
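The question is truncated before the asker's handler code, but the restart-on-death / terminate-on-shutdown pattern can be sketched independently. A minimal, POSIX-oriented illustration (the worker body, process names, and polling interval are assumptions): the parent installs SIGTERM/SIGINT handlers that flip a flag, and its supervision loop restarts any child that is no longer alive until the flag is set.

    import time
    import signal
    from multiprocessing import Process

    shutdown = False

    def worker(name):
        # Placeholder child: idles until it is terminated or killed.
        while True:
            time.sleep(1)

    def request_shutdown(signum, frame):
        global shutdown
        shutdown = True

    def spawn(name):
        p = Process(target=worker, args=(name,), name=name)
        p.start()
        return p

    if __name__ == "__main__":
        signal.signal(signal.SIGTERM, request_shutdown)
        signal.signal(signal.SIGINT, request_shutdown)

        children = {name: spawn(name) for name in ("child-1", "child-2")}

        while not shutdown:
            for name, proc in children.items():
                if not proc.is_alive():           # child was killed or crashed
                    proc.join()
                    children[name] = spawn(name)  # restart it
            time.sleep(0.5)

        # Parent received SIGTERM/SIGINT: stop all children, then exit.
        for proc in children.values():
            proc.terminate()
            proc.join()

Polling with is_alive() avoids relying on SIGCHLD handlers, which interact awkwardly with multiprocessing's own child management.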

Two functions in parallel with multiple arguments and return values

Submitted by 做~自己de王妃 on 2021-02-17 19:42:37
Question: I've got two separate functions, each of which takes quite a long time to execute.

    def function1(arg):
        do_some_stuff_here
        return result1

    def function2(arg1, arg2, arg3):
        do_some_stuff_here
        return result2

I'd like to launch them in parallel, get their results (knowing which is which), and process the results afterwards. From what I understand, multiprocessing is more efficient than threading in Python 2.7 (GIL-related issue). However, I'm a bit lost as to whether it is better to use Process, Pool, or ...
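The excerpt stops mid-sentence, but the "two different functions, different signatures, keep track of which result is which" case maps naturally onto Pool.apply_async. A hedged sketch (the function bodies and arguments are placeholders; it avoids the with-statement so it also runs on the Python 2.7 mentioned in the question):

    from multiprocessing import Pool
    import time

    def function1(arg):
        time.sleep(2)                 # stand-in for slow work
        return arg * 2

    def function2(arg1, arg2, arg3):
        time.sleep(2)
        return arg1 + arg2 + arg3

    if __name__ == "__main__":
        pool = Pool(processes=2)
        # Submit both calls without blocking; each returns an AsyncResult.
        r1 = pool.apply_async(function1, (10,))
        r2 = pool.apply_async(function2, (1, 2, 3))

        result1 = r1.get()            # blocks until function1 finishes
        result2 = r2.get()            # blocks until function2 finishes
        pool.close()
        pool.join()

        print(result1)                # 20
        print(result2)                # 6

Because each AsyncResult is bound to the call that produced it, there is no ambiguity about which result belongs to which function.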

Generator function of a child process runs in the parent process

Submitted by 感情迁移 on 2021-02-17 04:52:05
Question: I am trying to run a generator in parallel in child processes, but when I try to do this, I see that the generator function is executed by the parent process!

    from multiprocessing import Process
    import os
    import time

    class p(Process):
        def __init__(self):
            Process.__init__(self)

        def run(self):
            print('PID:', os.getpid())

        def genfunc(self):
            time.sleep(1)
            yield os.getpid()

    p1 = p()
    p2 = p()
    p1.start()
    p2.start()

    print('Iterators:')
    print('Ran by:', next(p1.genfunc()))
    print('Ran by:' ...
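The excerpt cuts off, but the behaviour it describes is expected: only the body of run() executes in the child, so calling p1.genfunc() from the main script simply runs the generator in the parent. One way to get the intended effect is to drain the generator inside run() and ship its values back through a queue; the class name, queue, and sentinel below are assumptions for illustration, not the asker's code.

    from multiprocessing import Process, Queue
    import os
    import time

    class GenWorker(Process):
        def __init__(self, queue):
            super().__init__()
            self.queue = queue

        def genfunc(self):
            time.sleep(1)
            yield os.getpid()

        def run(self):
            # Runs in the child: iterate the generator here, send values back.
            for value in self.genfunc():
                self.queue.put(value)
            self.queue.put(None)              # sentinel: this worker is done

    if __name__ == "__main__":
        q = Queue()
        workers = [GenWorker(q), GenWorker(q)]
        for w in workers:
            w.start()

        finished = 0
        while finished < len(workers):
            item = q.get()
            if item is None:
                finished += 1
            else:
                print('Ran by:', item)        # prints each child's PID

        for w in workers:
            w.join()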

Python: Multiprocessing on Windows -> Shared Readonly Memory

Submitted by 徘徊边缘 on 2021-02-11 14:44:58
Question: Is there a way to share a huge dictionary with multiprocessing subprocesses on Windows without duplicating the whole memory? I only need it read-only within the sub-processes, if that helps. My program roughly looks like this:

    def workerFunc(args):
        id, data_mp, some_more_args = args
        # Do some logic
        # Parse some files on the disk
        # and access some random keys from data_mp, which are only known
        # after parsing those files on disk
        ...
        some_keys = [some_random_ids...]
        # Do something with do_something ...
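The excerpt is truncated, but the usual pain point is that Windows has no fork, so passing data_mp inside every task's arguments re-pickles the whole dictionary for each task. One hedged workaround: hand the dictionary to each worker once, via a Pool initializer, and keep it in a module-level global. The names big_dict, load_big_dict, and the workerFunc signature below are assumptions for illustration.

    import multiprocessing

    big_dict = None   # populated once per worker by init_worker

    def load_big_dict():
        # Stand-in for however the real dictionary is built or loaded.
        return {i: str(i) * 10 for i in range(100_000)}

    def init_worker(shared):
        global big_dict
        big_dict = shared              # received once, at worker start-up

    def workerFunc(task_id):
        # Read-only access to the per-process copy of the dictionary.
        return task_id, big_dict.get(task_id)

    if __name__ == "__main__":
        data = load_big_dict()
        with multiprocessing.Pool(initializer=init_worker, initargs=(data,)) as pool:
            results = pool.map(workerFunc, range(20))
        print(results[:3])

This still copies the dictionary once per worker process (it is pickled through initargs under spawn); truly zero-copy sharing on Windows would need something like multiprocessing.shared_memory or a memory-mapped store instead of a plain dict.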

Python Multiprocessing: Broken Pipe exception after increasing Pool size

Submitted by 北城余情 on 2021-02-10 17:28:42
Question: This is the exception I get. All I did was increase the pool count.

Code:

    def parse(url):
        r = request.get(url)

    POOL_COUNT = 75
    with Pool(POOL_COUNT) as p:
        result = p.map(parse, links)

Traceback:

    File "/usr/lib64/python3.5/multiprocessing/pool.py", line 130, in worker
        put((job, i, (False, wrapped)))
    File "/usr/lib64/python3.5/multiprocessing/queues.py", line 355, in put
        self._writer.send_bytes(obj)
    File "/usr/lib64/python3.5/multiprocessing/connection.py", line 200, in send_bytes
        self._send_bytes(m[offset:offset + ...
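The traceback is truncated, but the frames shown are a worker failing to send its result back to the parent, which typically surfaces as a broken pipe once the parent side of the queue has gone away. Note also that `request.get(url)` raises a NameError unless `request` is defined; the usual module is `requests`. A hedged sketch of a more defensive version, catching per-task errors inside the worker and using a more modest pool size; the URL list and pool size here are illustrative only.

    from multiprocessing import Pool
    import requests

    def parse(url):
        # Catch errors inside the worker so one bad URL cannot poison the pool;
        # return (url, status code or error message) either way.
        try:
            r = requests.get(url, timeout=10)
            return url, r.status_code
        except Exception as exc:
            return url, "error: %s" % exc

    if __name__ == "__main__":
        links = ["https://example.com"] * 20      # placeholder URLs
        POOL_COUNT = 8                            # far fewer than 75 workers

        with Pool(POOL_COUNT) as p:
            results = p.map(parse, links)

        for url, status in results:
            print(url, status)

For I/O-bound downloads, 75 processes is rarely faster than a handful of processes (or a thread pool), and the extra pipes and memory pressure make failures like the one above more likely.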

Parallelizing comparisons between two dataframes with multiprocessing

Submitted by 半世苍凉 on 2021-02-10 15:57:06
Question: I've got the following function that lets me compare the rows of two dataframes (data and ref) and return the index of both rows if there's a match.

    def get_gene(row):
        m = np.equal(row[0], ref.iloc[:, 0].values) & \
            np.greater_equal(row[2], ref.iloc[:, 2].values) & \
            np.less_equal(row[3], ref.iloc[:, 3].values)
        return ref.index[m] if m.any() else None

Since this takes a long time (25 min for 1.6M rows in data against 20K rows in ref), I tried to speed things up by ...
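The excerpt cuts off before the asker's parallel attempt, but one common pattern for this shape of problem is to split data into chunks, let each worker run the row-wise comparison against ref, and concatenate the per-chunk results. A hedged sketch under those assumptions; the column meanings, chunk count, and sample data are made up for illustration.

    import numpy as np
    import pandas as pd
    from multiprocessing import Pool

    # Tiny made-up 'ref': column 0 must match exactly, and the row's
    # [start, end] (columns 2 and 3) must fall inside ref's interval.
    # Defined at module level so spawned workers can see it too.
    ref = pd.DataFrame({0: ['chr1', 'chr2'], 1: ['a', 'b'],
                        2: [0, 100], 3: [50, 200]})

    def get_gene(row):
        m = (np.equal(row[0], ref.iloc[:, 0].values)
             & np.greater_equal(row[2], ref.iloc[:, 2].values)
             & np.less_equal(row[3], ref.iloc[:, 3].values))
        return ref.index[m] if m.any() else None

    def process_chunk(chunk):
        # Each worker handles one contiguous slice of 'data', row by row.
        return [get_gene(row) for row in chunk.itertuples(index=False)]

    if __name__ == "__main__":
        data = pd.DataFrame({0: ['chr1', 'chr2', 'chr3'], 1: ['x', 'y', 'z'],
                             2: [10, 120, 5], 3: [20, 150, 9]})

        n_workers = 4
        chunk_size = max(1, -(-len(data) // n_workers))   # ceil division
        chunks = [data.iloc[i:i + chunk_size]
                  for i in range(0, len(data), chunk_size)]

        with Pool(n_workers) as pool:
            results = pool.map(process_chunk, chunks)

        matches = [idx for part in results for idx in part]
        print(matches)   # roughly [Index([0]), Index([1]), None]

With 1.6M rows against 20K reference rows, vectorising the whole comparison (e.g. an interval join or merge on column 0 followed by boolean filtering) may well beat multiprocessing, since each row-wise call still scans all of ref.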