pool

When does the pool change?

白昼怎懂夜的黑 submitted on 2019-11-27 05:24:31
I have two questions:

```java
public static void main(String[] args) {
    String s1 = "bla";
    String s2 = "b" + "l" + "a";
    String s3 = "b".concat("l").concat("a");
    if (s1 == s2)
        System.out.println("Equal");
    else
        System.out.println("Not equal");
    if (s1 == s3)
        System.out.println("Equal");
    else
        System.out.println("Not equal");
}
```

Why do s1 and s2 point to the same object, whereas s1 and s3 don't? (There is no usage of the new keyword.) If I get a string from the user and add these lines to the above code:

```java
BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
String name = in.readLine();
if (name
```

Pulling data from a CMSampleBuffer in order to create a deep copy

安稳与你 submitted on 2019-11-27 02:16:05
Question: I am trying to create a copy of a CMSampleBuffer as returned by captureOutput in an AVCaptureVideoDataOutputSampleBufferDelegate. Since the CMSampleBuffers come from a preallocated pool of (15) buffers, if I keep a reference to them they cannot be recollected. This causes all remaining frames to be dropped.

> To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the

Python Multiprocess Pool. How to exit the script when one of the worker processes determines no more work needs to be done?

女生的网名这么多〃 submitted on 2019-11-27 02:02:40
```python
mp.set_start_method('spawn')
total_count = Counter(0)
pool = mp.Pool(initializer=init, initargs=(total_count,), processes=num_proc)
pool.map(part_crack_helper, product(seed_str, repeat=4))
pool.close()
pool.join()
```

So I have a pool of worker processes that do some work. The pool just needs to find one solution, so when one of the worker processes finds it, I want to stop everything. One way I thought of was just calling sys.exit(). However, that doesn't seem to work properly, since other processes are still running. Another way was to check the return value of each process
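
One commonly suggested pattern for this (a sketch, not the asker's code — check_candidate and the success condition here are stand-ins for part_crack_helper and its solution test) is to consume results with imap_unordered and terminate the pool as soon as one worker reports success:

```python
import multiprocessing as mp

def check_candidate(candidate):
    # Stand-in for part_crack_helper: only candidate 7 counts as a "solution".
    return candidate if candidate == 7 else None

def main():
    with mp.Pool(processes=4) as pool:
        # imap_unordered yields results as soon as any worker produces one
        for result in pool.imap_unordered(check_candidate, range(100)):
            if result is not None:
                pool.terminate()   # stop the remaining workers right away
                return result
    return None

if __name__ == '__main__':
    print(main())
```

Calling sys.exit() inside a worker only ends that worker; terminating from the parent, as above, is what actually stops the whole pool.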

Multiprocessing.Pool makes Numpy matrix multiplication slower

若如初见. submitted on 2019-11-26 22:52:56
So, I am playing around with multiprocessing.Pool and NumPy, but it seems I missed some important point. Why is the pool version much slower? I looked at htop and I can see several processes being created, but they all share one of the CPUs, adding up to ~100%.

$ cat test_multi.py

```python
import numpy as np
from timeit import timeit
from multiprocessing import Pool

def mmul(matrix):
    for i in range(100):
        matrix = matrix * matrix
    return matrix

if __name__ == '__main__':
    matrices = []
    for i in range(4):
        matrices.append(np.random.random_integers(100, size=(1000, 1000)))
    pool = Pool(8)
    print timeit(lambda:
```
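
One frequently reported cause of this symptom (an assumption on my part, not something stated in the question) is that certain BLAS builds pin the importing process to a single core, and workers forked by multiprocessing inherit that CPU affinity mask. On Linux you can inspect and widen the mask with os.sched_setaffinity; a minimal sketch:

```python
import os

def reset_affinity():
    # Forked workers inherit the parent's CPU affinity mask. If a library
    # pinned the parent to one core on import, every worker ends up sharing
    # that core; re-allow all cores explicitly (Linux-only API).
    try:
        os.sched_setaffinity(0, range(os.cpu_count()))  # 0 = this process
    except OSError:
        pass  # containers may restrict which CPUs can be requested
    return os.sched_getaffinity(0)

if __name__ == '__main__':
    print(len(reset_affinity()))
```

Passing this as Pool(initializer=reset_affinity) makes each worker widen its own mask before doing any work.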

How do you pass a Queue reference to a function managed by pool.map_async()?

独自空忆成欢 submitted on 2019-11-26 22:43:25
I want a long-running process to return its progress over a Queue (or something similar), which I will feed to a progress-bar dialog. I also need the result when the process is completed. A test example here fails with RuntimeError: Queue objects should only be shared between processes through inheritance.

```python
import multiprocessing, time

def task(args):
    count = args[0]
    queue = args[1]
    for i in xrange(count):
        queue.put("%d mississippi" % i)
    return "Done"

def main():
    q = multiprocessing.Queue()
    pool = multiprocessing.Pool()
    result = pool.map_async(task, [(x, q) for x in range(10)])
    time.sleep(1)
```
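
The usual workaround for that RuntimeError (a sketch reusing the names from the snippet above, ported to Python 3 so xrange becomes range; the progress-bar wiring is omitted) is to create the queue through a multiprocessing.Manager(), whose proxy objects can be pickled and passed through map_async:

```python
import multiprocessing

def task(args):
    count, queue = args
    for i in range(count):
        queue.put("%d mississippi" % i)
    return "Done"

def main():
    manager = multiprocessing.Manager()
    q = manager.Queue()          # proxy queue: picklable, unlike multiprocessing.Queue()
    pool = multiprocessing.Pool()
    result = pool.map_async(task, [(x, q) for x in range(10)])
    outcomes = result.get()      # block until every task has finished
    pool.close()
    pool.join()
    messages = []
    while not q.empty():         # safe here: all producers have exited
        messages.append(q.get())
    return outcomes, messages

if __name__ == '__main__':
    outcomes, messages = main()
    print(len(messages))
```

In a real progress bar you would poll q in the parent while map_async runs, instead of draining it after the fact as this sketch does.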

How to use Python multiprocessing Pool.map to fill numpy array in a for loop

ε祈祈猫儿з submitted on 2019-11-26 21:49:17
Question: I want to fill a 2D NumPy array within a for loop and speed up the calculation by using multiprocessing.

```python
import numpy
from multiprocessing import Pool

array_2D = numpy.zeros((20,10))
pool = Pool(processes = 4)

def fill_array(start_val):
    return range(start_val,start_val+10)

list_start_vals = range(40,60)
for line in xrange(20):
    array_2D[line,:] = pool.map(fill_array,list_start_vals)
pool.close()
print array_2D
```

The effect of executing it is that Python runs 4 subprocesses and occupies 4 CPU cores
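
A pattern that sidesteps the pitfalls in the code above (workers cannot write into the parent's array, and mapping over all twenty start values inside the per-row loop produces twenty rows at once) is to let each worker return one row and assemble the array in the parent; a minimal Python 3 sketch:

```python
import numpy
from multiprocessing import Pool

def fill_array(start_val):
    # Each worker builds one full row and returns it to the parent.
    return list(range(start_val, start_val + 10))

def main():
    list_start_vals = range(40, 60)      # one start value per row, 20 rows
    with Pool(processes=4) as pool:
        rows = pool.map(fill_array, list_start_vals)
    return numpy.array(rows)             # assemble the 20x10 array here

if __name__ == '__main__':
    print(main().shape)
```

Note the worker function is defined at module level before the pool is used, which also keeps it picklable under the spawn start method.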

Can we access or query the Java String intern (constant) pool?

蹲街弑〆低调 submitted on 2019-11-26 21:11:03
Question: Is there a way to access the contents of the String constant pool within our own program? Say I have some basic code that does this:

```java
String str1 = "foo";
String str2 = "bar";
```

There are now 2 strings floating around in our String constant pool. Is there some way to access the pool and print out the above values, or get the total number of elements currently contained in the pool? i.e.

```java
StringConstantPool pool = new StringConstantPool();
System.out.println(pool.getSize());
// etc
```

Answer 1:

Can I use a multiprocessing Queue in a function called by Pool.imap?

我怕爱的太早我们不能终老 submitted on 2019-11-26 18:42:19
I'm using Python 2.7 and trying to run some CPU-heavy tasks in their own processes. I would like to be able to send messages back to the parent process to keep it informed of the current status of the process. The multiprocessing Queue seems perfect for this, but I can't figure out how to get it to work. So, this is my basic working example, minus the use of a Queue.

```python
import multiprocessing as mp
import time

def f(x):
    return x*x

def main():
    pool = mp.Pool()
    results = pool.imap_unordered(f, range(1, 6))
    time.sleep(1)
    print str(results.next())
    pool.close()
    pool.join()

if __name__ == '__main__':
    main(
```
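
One common way to get a Queue into Pool workers (a Python 3 sketch extending the example above; the initializer pattern is one standard fix, not the only one) is to hand it over through the Pool's initializer, so each worker inherits it at creation instead of receiving it through pickled task arguments:

```python
import multiprocessing as mp

queue = None  # populated in each worker by the initializer

def init_worker(q):
    global queue
    queue = q

def f(x):
    queue.put("processing %d" % x)   # progress message for the parent
    return x * x

def main():
    q = mp.Queue()
    pool = mp.Pool(initializer=init_worker, initargs=(q,))
    results = sorted(pool.imap_unordered(f, range(1, 6)))
    pool.close()
    pool.join()
    messages = [q.get() for _ in range(5)]   # one message per task
    return results, messages

if __name__ == '__main__':
    print(main()[0])
```

Passing the Queue via initargs works because Pool hands initargs to each worker when the worker process is created, which is exactly the "sharing through inheritance" the error message demands.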

Passing multiple parameters to pool.map() function in Python [duplicate]

本秂侑毒 submitted on 2019-11-26 17:57:56
Question: This question already has an answer here: Python multiprocessing pool.map for multiple arguments (18 answers). I need some way to use a function within pool.map() that accepts more than one parameter. As per my understanding, the target function of pool.map() can only have one iterable as a parameter, but is there a way that I can pass other parameters in as well? In this case, I need to pass in a few configuration variables, like my Lock() and logging information, to the target function. I have
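
Two standard options are functools.partial and Pool.starmap. A Python 3 sketch with made-up stand-in parameters — prefix and factor take the place of the asker's configuration; note that a Lock itself cannot be pickled into task arguments this way and would need the Pool's initializer instead:

```python
from functools import partial
from multiprocessing import Pool

def process(item, prefix, factor):
    # prefix/factor stand in for extra configuration the worker needs
    return "%s:%d" % (prefix, item * factor)

def main():
    with Pool(processes=4) as pool:
        # partial freezes the extra parameters; map still gets one iterable
        worker = partial(process, prefix="job", factor=10)
        via_partial = pool.map(worker, [1, 2, 3])
        # starmap (Python 3.3+) unpacks a tuple of arguments per call instead
        via_starmap = pool.starmap(process, [(1, "job", 10), (2, "job", 10)])
    return via_partial, via_starmap

if __name__ == '__main__':
    print(main())
```

Both approaches keep the worker function at module level, so it stays picklable.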

How to let Pool.map take a lambda function

妖精的绣舞 submitted on 2019-11-26 17:41:41
Question: I have the following function:

```python
def copy_file(source_file, target_dir):
    pass
```

Now I would like to use multiprocessing to execute this function at once:

```python
p = Pool(12)
p.map(lambda x: copy_file(x, target_dir), file_list)
```

The problem is, lambdas can't be pickled, so this fails. What is the neatest (most pythonic) way to fix this?

Answer 1: Use a function object:

```python
class Copier(object):
    def __init__(self, tgtdir):
        self.target_dir = tgtdir

    def __call__(self, src):
        copy_file(src, self.target_dir)
```

To run your
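
A complete runnable sketch of the same idea, alongside the shorter functools.partial variant (copy_file here is a stand-in that only reports what it would copy, and /tmp/out is a made-up target directory):

```python
from functools import partial
from multiprocessing import Pool

def copy_file(source_file, target_dir):
    # Stand-in for the real copy: report what would be copied where.
    return "%s -> %s" % (source_file, target_dir)

class Copier(object):
    def __init__(self, tgtdir):
        self.target_dir = tgtdir

    def __call__(self, src):
        return copy_file(src, self.target_dir)

def main():
    file_list = ["a.txt", "b.txt"]
    with Pool(processes=2) as pool:
        # Both a module-level class instance and a partial are picklable,
        # unlike the lambda in the question.
        via_copier = pool.map(Copier("/tmp/out"), file_list)
        via_partial = pool.map(partial(copy_file, target_dir="/tmp/out"), file_list)
    return via_copier, via_partial

if __name__ == '__main__':
    print(main())
```

The function-object form is handy when the worker needs state or setup; for a single fixed argument, partial is usually the shorter choice.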