python-multiprocessing

Multiprocessing: use only the physical cores?

天大地大妈咪最大 submitted on 2020-03-17 03:50:26
Question: I have a function foo that consumes a lot of memory and that I would like to run in several parallel instances. Suppose I have a CPU with 4 physical cores, each with two logical cores. My system has enough memory to accommodate 4 instances of foo in parallel, but not 8. Moreover, since 4 of these 8 cores are logical ones anyway, I do not expect that using all 8 cores will provide much gain beyond using the 4 physical ones only. So I want to run foo on the 4 physical cores only.
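A minimal sketch of the usual approach: detect the physical core count and size the pool to it. psutil is a third-party package (an assumption here), and foo/work_items stand in for the question's own function and inputs.

    import multiprocessing
    import psutil  # third-party; can distinguish physical from logical cores

    def foo(x):
        # placeholder for the memory-hungry function from the question
        return x * x

    if __name__ == "__main__":
        n_physical = psutil.cpu_count(logical=False)  # 4 on the CPU described
        work_items = range(100)  # hypothetical inputs
        pool = multiprocessing.Pool(processes=n_physical)
        results = pool.map(foo, work_items)
        pool.close()
        pool.join()

Note that this only caps concurrency at the physical-core count; it does not pin each worker to a particular core.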

Pybind11 multiprocessing hangs

假如想象 submitted on 2020-03-12 05:11:48
Question: I'm writing an application that uses Pybind11 to embed the Python interpreter (Windows, 64-bit, Visual C++ 2017). From Python, I need to spawn multiple processes, but it doesn't seem to work. I tried the following code as a test:

    import multiprocessing
    import os
    import sys
    import time

    print("This is the name of the script: ", sys.argv[0])
    print("Number of arguments: ", len(sys.argv))
    print("The arguments are: ", str(sys.argv))
    prefix = str(os.getpid()) + "-"
    if len(sys.argv) > 1:
        __name__ = "__mp_main__"
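When the interpreter is embedded, multiprocessing tries to launch the host executable as the child process, which is one common reason for hangs on Windows. A hedged sketch of a possible workaround, using multiprocessing.set_executable with a placeholder interpreter path:

    import multiprocessing

    def child():
        print("hello from the spawned process")

    if __name__ == "__main__":
        # point multiprocessing at a real python.exe instead of the
        # embedding host application (path below is a placeholder)
        multiprocessing.set_executable(r"C:\Python37\python.exe")
        p = multiprocessing.Process(target=child)
        p.start()
        p.join()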

Multiprocessing so slow

丶灬走出姿态 submitted on 2020-03-05 09:12:11
Question: I have a function that does the following: take a file as input and do basic cleaning; extract the required items from the file and write them into a pandas dataframe; finally convert the dataframe to CSV and write it into a folder. This is the sample code:

    def extract_function(filename):
        with open(filename, 'r') as f:
            input_data = f.readlines()
        try:
            # some basic searching, pattern matching, extracting
            # dataframe creation with 10 columns and then extracted values are filled in empty
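If each file is small, per-task dispatch overhead can dominate the runtime. A minimal sketch, assuming an extract_function like the one above, that batches work with imap_unordered and a chunksize to cut that overhead:

    import glob
    import multiprocessing

    def extract_function(filename):
        # placeholder for the cleaning/extraction/CSV-writing logic above
        with open(filename, 'r') as f:
            return filename, len(f.readlines())

    if __name__ == "__main__":
        files = glob.glob("input/*.txt")  # hypothetical input folder
        pool = multiprocessing.Pool()
        # chunksize hands each worker a batch of files at a time
        for name, n_lines in pool.imap_unordered(extract_function, files, chunksize=10):
            print(name, n_lines)
        pool.close()
        pool.join()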

Multiple queues from one multiprocessing Manager

北慕城南 submitted on 2020-03-02 21:44:01
Question: I'm writing a script that will use Python's multiprocessing and threading modules. For context, I spawn as many processes as there are cores available, and inside each process I start e.g. 25 threads. Each thread consumes from an input_queue and produces to an output_queue. For the queue object I use multiprocessing.Queue. After my first tests I got a deadlock because the thread responsible for feeding and flushing the Queue was hanging. After a while I found that I can use Queue().cancel
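A single Manager can hand out any number of independent queues, which sidesteps the feeder-thread flush behaviour of raw multiprocessing.Queue. A minimal sketch:

    import multiprocessing

    def worker(in_q, out_q):
        while True:
            item = in_q.get()
            if item is None:  # sentinel: stop
                break
            out_q.put(item * 2)

    if __name__ == "__main__":
        manager = multiprocessing.Manager()
        input_queue = manager.Queue()   # both queues come from one Manager
        output_queue = manager.Queue()
        p = multiprocessing.Process(target=worker, args=(input_queue, output_queue))
        p.start()
        for i in range(5):
            input_queue.put(i)
        input_queue.put(None)
        p.join()
        while not output_queue.empty():
            print(output_queue.get())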

Returning multiple lists from pool.map processes?

风格不统一 submitted on 2020-03-02 05:56:29
Question: Win 7, x64, Python 2.7.12. In the following code I am setting off some pool processes to do a trivial multiplication via the multiprocessing.Pool.map() method. The output data is collected in List_1. NOTE: this is a stripped-down simplification of my actual code. There are multiple lists involved in the real application, all huge.

    import multiprocessing
    import numpy as np

    def createLists(branches):
        firstList = branches[:] * node
        return firstList

    def init_process(lNodes):
        global node
        node =
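One common pattern, sketched here with hypothetical data: have each worker return a tuple of lists and regroup them with zip(*...) on the parent side (written without context managers, since the question targets Python 2.7):

    import multiprocessing

    def worker(branches):
        # hypothetical stand-in for createLists, returning two lists per task
        first = [b * 2 for b in branches]
        second = [b + 10 for b in branches]
        return first, second

    if __name__ == "__main__":
        batches = [[1, 2, 3], [4, 5, 6]]
        pool = multiprocessing.Pool()
        results = pool.map(worker, batches)  # list of (first, second) tuples
        pool.close()
        pool.join()
        firsts, seconds = zip(*results)      # regroup into per-list sequences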

How to share state when using concurrent futures

点点圈 submitted on 2020-03-01 18:33:27
Question: I am aware that using the traditional multiprocessing library I can declare a Value and share state between processes: https://docs.python.org/3/library/multiprocessing.html?highlight=multiprocessing#sharing-state-between-processes. When using the newer concurrent.futures library, how can I share state between my processes?

    import concurrent.futures

    def get_user_object(batch):
        # do some work
        counter = counter + 1
        print(counter)

    def do_multithreading(batches):
        with concurrent.futures
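One way to do this, sketched under the assumption of Python 3.7+ (which added the initializer/initargs parameters to ProcessPoolExecutor), is to hand a multiprocessing.Value to every worker at startup:

    import concurrent.futures
    import multiprocessing

    counter = None

    def init_worker(shared_counter):
        global counter
        counter = shared_counter  # each worker keeps a handle to the same Value

    def get_user_object(batch):
        # do some work, then bump the shared counter safely
        with counter.get_lock():
            counter.value += 1

    if __name__ == "__main__":
        shared = multiprocessing.Value("i", 0)
        with concurrent.futures.ProcessPoolExecutor(
                initializer=init_worker, initargs=(shared,)) as executor:
            list(executor.map(get_user_object, range(10)))
        print(shared.value)  # 10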

python3 multiprocess shared numpy array(read-only)

僤鯓⒐⒋嵵緔 submitted on 2020-02-25 04:11:27
Question: I'm not sure whether this title fits my situation: the reason I want to share a numpy array is that it might be one of the potential solutions to my case, but other solutions would also be welcome. My task: I need to implement an iterative algorithm with multiprocessing, where each process needs a copy of the data (the data is large, read-only, and won't change during the iterative algorithm). I've written some pseudo code to demonstrate my idea:
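One sketch, assuming a fork start method (the Linux default): create the array before the pool forks and let every worker read the inherited copy, so the pages are shared until written to.

    import multiprocessing
    import numpy as np

    data = np.random.rand(1000, 100)  # large read-only array, created pre-fork

    def work(i):
        # child processes see the parent's array without pickling it,
        # as long as they only read it (copy-on-write under fork)
        return data[i].sum()

    if __name__ == "__main__":
        pool = multiprocessing.Pool()
        results = pool.map(work, range(len(data)))
        pool.close()
        pool.join()

On Windows (spawn), the module is re-executed in each child, so something like multiprocessing.shared_memory (Python 3.8+) would be the alternative there.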