python-multiprocessing

Python share values

帅比萌擦擦* submitted on 2019-12-23 10:55:14
Question: In my project I have multiple flags like this:

```python
file_a = False
file_b = False
file_c = False
```

and I'm trying to run two processes: one (call it A from now on) handles incoming messages on a message queue, and the second (call it B from now on) handles some data processing. B operates on the boolean flags, and A sets their values:

```python
def a():
    while True:
        ...
        file_a = True
        ...

def b():
    while True:
        ...
        if file_a:
            process(file_a)
        ...

a_proc = Process(target=a)
b_proc = Process(target=b)
a_proc.start()
b_proc.start()
```
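Plain module-level booleans are not shared between processes; each child gets its own copy, so B never sees A's assignment. A minimal sketch of one common fix, using multiprocessing.Value as a shared flag (the names setter, watcher, and out are illustrative, not from the question):

```python
from multiprocessing import Process, Value

def setter(flag):
    # A's side: set the shared flag
    flag.value = 1

def watcher(flag, out):
    # B's side: busy-wait until the flag becomes true, then do the work
    while flag.value == 0:
        pass
    out.value = 42

if __name__ == '__main__':
    flag = Value('i', 0)   # shared int used as a boolean
    out = Value('i', 0)
    b = Process(target=watcher, args=(flag, out))
    b.start()
    setter(flag)           # run A's side in the parent for brevity
    b.join()
    print(out.value)       # 42
```

In real code the busy-wait would be replaced by a multiprocessing.Event, which blocks instead of spinning.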

Multiprocessing Python with RPYC “ValueError: pickling is disabled”

和自甴很熟 submitted on 2019-12-23 10:47:06
Question: I am trying to use the multiprocessing package within an rpyc service, but I get ValueError: pickling is disabled when I try to call the exposed function from the client. I understand that the multiprocessing package uses pickling to pass information between processes, and that pickling is disallowed by default in rpyc because it is an insecure protocol. So I am unsure what the best way is (or whether there is any way) to use multiprocessing with rpyc. How can I make use of multiprocessing within an rpyc service?
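One route, if the security trade-off is acceptable, is to enable pickling in rpyc's protocol configuration on both ends; a configuration sketch (service name, port, and the exposed method are illustrative):

```python
import rpyc
from rpyc.utils.server import ThreadedServer

class ComputeService(rpyc.Service):
    def exposed_crunch(self, data):
        # multiprocessing is used only with local, plain-Python data here
        from multiprocessing import Pool
        with Pool(2) as pool:
            return pool.map(abs, list(data))

if __name__ == '__main__':
    # allow_pickle must be set on the server...
    server = ThreadedServer(ComputeService, port=18861,
                            protocol_config={"allow_pickle": True})
    server.start()
    # ...and symmetrically on the client:
    # conn = rpyc.connect("localhost", 18861, config={"allow_pickle": True})
```

The safer alternative is to avoid pickling across the connection entirely: convert netref arguments to local objects inside the exposed method (e.g. copy them into plain lists) before handing them to the pool, and return only plain types.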

Robust way to manage and kill any process

喜夏-厌秋 submitted on 2019-12-23 07:40:12
Question: I am writing code to run experiments in parallel. I don't have control over what the experiments do; they might use subprocess.Popen or check_output to run one or more additional child processes. I have two requirements: I want to be able to kill experiments that exceed a timeout, and I want to kill experiments upon KeyboardInterrupt. Most ways to terminate processes don't make sure that all subprocesses etc. are killed. This is obviously a problem if 100s of experiments are run one
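One common approach on POSIX systems (a sketch, not the asker's code) is to start each experiment in its own session, so it becomes the leader of a fresh process group, and then kill the whole group on timeout; grandchildren spawned via subprocess.Popen die with it:

```python
import os
import signal
import subprocess

def run_with_timeout(cmd, timeout):
    # start_new_session=True puts the child in its own process group
    proc = subprocess.Popen(cmd, start_new_session=True)
    try:
        return proc.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        # kill the entire group: the child and all of its descendants
        os.killpg(os.getpgid(proc.pid), signal.SIGKILL)
        proc.wait()  # reap the zombie
        return None

if __name__ == '__main__':
    print(run_with_timeout(['sleep', '0.1'], timeout=5))    # 0 (clean exit)
    print(run_with_timeout(['sleep', '60'], timeout=0.2))   # None (group killed)
```

The same os.killpg call can be issued from a KeyboardInterrupt handler to tear down every running experiment at once.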

Python multiprocessing pool map with multiple arguments [duplicate]

谁都会走 submitted on 2019-12-23 02:59:15
Question: This question already has answers here: Python multiprocessing pool.map for multiple arguments (18 answers). Closed 2 years ago. I have a function to be called from a multiprocessing pool.map with multiple arguments:

```python
from multiprocessing import Pool
import time

def printed(num, num2):
    print 'here now '
    return num

class A(object):
    def __init__(self):
        self.pool = Pool(8)

    def callme(self):
        print self.pool.map(printed, (1, 2), (3, 4))

if __name__ == '__main__':
    aa = A()
    aa.callme()
```

but it gives me
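Pool.map takes a single iterable, so the extra (3, 4) above lands in the chunksize parameter rather than being passed to the function. In Python 3.3+, Pool.starmap unpacks argument tuples; a minimal sketch:

```python
from multiprocessing import Pool

def printed(num, num2):
    return num + num2

if __name__ == '__main__':
    with Pool(2) as pool:
        # each tuple is unpacked into (num, num2)
        print(pool.starmap(printed, [(1, 2), (3, 4)]))  # [3, 7]
```

On Python 2 the usual workaround is a wrapper that takes one tuple and unpacks it itself before calling the real function.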

How to limit the number of CPUs used by a Python script without a terminal or the multiprocessing library?

浪尽此生 submitted on 2019-12-23 01:50:55
Question: My main problem is described here. Since no one has given a solution yet, I have decided to find a workaround. I am looking for a way to limit a Python script's CPU usage (not its priority, but the number of CPU cores it uses) from within the code. I know I can do that with the multiprocessing library (Pool, etc.), but I am not the one running it with multiprocessing, so I don't know how to do that. I could also do it via the terminal, but this script is being imported by another script. Unfortunately, I don't
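On Linux, one in-code workaround (a sketch, not from the question) is CPU affinity via os.sched_setaffinity, which caps the cores the process, and any children it forks, may run on, without touching priority:

```python
import os

def limit_cores(n):
    """Restrict this process to the first n of its allowed CPUs (Linux-only)."""
    allowed = sorted(os.sched_getaffinity(0))  # CPUs we may currently use
    os.sched_setaffinity(0, set(allowed[:n]))  # pid 0 means "this process"
    return len(os.sched_getaffinity(0))

if __name__ == '__main__':
    print(limit_cores(1))  # 1
```

os.sched_setaffinity is not available on macOS or Windows; on those platforms third-party tools fill the same role.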

Why can't I use pytorch to multi-cuda calculate distance? (Initialization Error for default and bad value(s) in fds_to_keep for spawn)

£可爱£侵袭症+ submitted on 2019-12-22 17:59:05
Question: I'm using pytorch not for networks, but just for GPU distance-matrix calculation. When I use just one GPU, everything works perfectly; but when it goes multi-GPU, an error occurs. First I got RuntimeError: CUDA error: initialization error, and I googled this error and found a solution: add mp.set_start_method('spawn'). But then a new error occurred; this time it was ValueError: bad value(s) in fds_to_keep, and I've not found a way to solve it. Now I'm confused and don't know how
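The CUDA initialization error typically means a worker was forked after CUDA had already been initialized; 'spawn' avoids that, but it must be set exactly once, inside the __main__ guard, before any pools, queues, or CUDA tensors are created. A CPU-only sketch of the placement (the worker here is a stand-in for the per-GPU distance computation):

```python
import multiprocessing as mp

def work(i):
    return i * i  # stand-in for the GPU distance-matrix chunk

def main():
    with mp.Pool(2) as pool:
        return pool.map(work, range(4))

if __name__ == '__main__':
    # must run before any multiprocessing machinery is built
    mp.set_start_method('spawn', force=True)
    print(main())  # [0, 1, 4, 9]
```

With actual pytorch, torch.multiprocessing is generally used in place of the stdlib module, since it knows how to share CUDA tensors between processes.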

arcpy + multiprocessing: error: could not save Raster dataset

耗尽温柔 submitted on 2019-12-22 11:28:01
Question: I'm doing some number crunching on a raster with arcpy and want to use the multiprocessing package to speed things up. Basically, I need to loop through a list of tuples, do some raster calculations using each tuple, and write some outputs to files. My inputs consist of a data raster (bathymetry), a raster that defines zones, and a tuple of two floats (water surface elevation, depth). My procedure consists of a function computeplane, which takes a tuple and runs a series of raster calculations
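arcpy aside, the underlying pattern is a pool mapping over a list of tuples where every worker writes to its own uniquely named output path, so no two workers ever try to save to the same dataset. A standard-library sketch (the file names and the computation are illustrative, not the asker's raster code):

```python
import os
import tempfile
from multiprocessing import Pool

def computeplane(args):
    elevation, depth = args
    result = elevation - depth  # stand-in for the raster calculation
    # derive a unique output path from the task's own inputs
    path = os.path.join(tempfile.gettempdir(),
                        'plane_{}_{}.txt'.format(elevation, depth))
    with open(path, 'w') as f:
        f.write(str(result))
    return path

if __name__ == '__main__':
    tuples = [(10.0, 2.0), (12.5, 3.0)]
    with Pool(2) as pool:
        print(pool.map(computeplane, tuples))
```

With arcpy the same idea usually extends to giving each worker its own scratch workspace, since the geoprocessing environment is per-process state.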

How to use boto3 client with Python multiprocessing?

主宰稳场 submitted on 2019-12-22 09:44:55
Question: The code looks something like this:

```python
import multiprocessing as mp
from functools import partial

import boto3
import numpy as np

s3 = boto3.client('s3')

def _something(**kwargs):
    # Some mixed integer programming stuff related to the variable archive
    return np.array(some_variable_related_to_archive)

def do(s3):
    archive = np.load(s3.get_object('some_key'))  # Simplified -- details not relevant
    pool = mp.Pool()
    sub_process = partial(_something, slack=0.1)
    parts = np.array_split(archive, some_int)
```

recursion max error when using futures.ProcessPoolExecutor but not futures.ThreadPoolExecutor with PRAW wrapper

微笑、不失礼 submitted on 2019-12-22 09:17:14
Question: I am using this code to scrape an API:

```python
submissions = get_submissions(1)
with futures.ProcessPoolExecutor(max_workers=4) as executor:
# or using this: with futures.ThreadPoolExecutor(max_workers=4) as executor:
    for s in executor.map(map_func, submissions):
        collection_front.update({"time_recorded": time_recorded},
                                {'$push': {"thread_list": s}},
                                upsert=True)
```

It works great/fast with threads, but when I try to use processes I get a full queue and this error:

File "/usr/local/lib/python3.4/dist
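Unlike threads, ProcessPoolExecutor must pickle the mapped function and every argument; PRAW submission objects carry network sessions and deeply nested state, which is where the recursion-limit error during pickling comes from. A common workaround (a sketch; the names are illustrative, not from the question) is to map over plain submission IDs and re-fetch inside the worker:

```python
from concurrent import futures

def map_func(submission_id):
    # in real code, the PRAW object would be fetched HERE, inside the
    # worker, e.g. reddit.submission(id=submission_id), so nothing
    # unpicklable ever crosses the process boundary
    return submission_id.upper()

if __name__ == '__main__':
    ids = ['abc1', 'abc2', 'abc3']
    with futures.ProcessPoolExecutor(max_workers=2) as ex:
        print(list(ex.map(map_func, ids)))  # ['ABC1', 'ABC2', 'ABC3']
```

ThreadPoolExecutor never pickles anything, which is why the original code only fails with processes.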

Why does pool run the entire file multiple times?

我的未来我决定 submitted on 2019-12-22 08:54:41
Question: I'm trying to understand the output from this Python 2.7.5 example script:

```python
import time
from multiprocessing import Pool

print(time.strftime('%Y-%m-%d %H:%M', time.localtime(time.time())))

props2 = [
    '170339',
    '170357',
    '170345',
    '170346',
    '171232',
    '170363',
]

def go(x):
    print(x)

if __name__ == '__main__':
    pool = Pool(processes=3)
    pool.map(go, props2)
    print(time.strftime('%Y-%m-%d %H:%M', time.localtime(time.time())))
```

This yields the output:

2015-08-06 10:13
2015-08-06 10:13
2015-08-06 10:13
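On platforms that use the 'spawn' start method (Windows in particular), each pool worker re-imports the script, so module-level statements, including the first print above, run once per process; only code under the __main__ guard is protected. A sketch of the usual fix (Python 3 shown), moving all side effects into the guard:

```python
import time
from multiprocessing import Pool

def go(x):
    return x  # side-effect-free worker

def timestamp():
    return time.strftime('%Y-%m-%d %H:%M')

if __name__ == '__main__':
    # runs exactly once, never in the re-imported workers
    print(timestamp())
    with Pool(processes=3) as pool:
        print(pool.map(go, ['170339', '170357']))
    print(timestamp())
```

Under this structure, the timestamp appears twice in total, once before the pool runs and once after, regardless of how many workers are spawned.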