concurrent.futures

Java Process did not quit after Future task completion

半世苍凉 submitted on 2019-12-10 18:09:23
Question: This is my code snippet using Future.

    import java.util.concurrent.*;
    import java.util.*;

    public class FutureDemo {
        public FutureDemo() {
            /* Future */
            ExecutorService service = Executors.newFixedThreadPool(2);
            for (int i = 0; i < 10; i++) {
                MyCallable myCallable = new MyCallable((long) i);
                Future<Long> futureResult = service.submit(myCallable);
                Long result = null;
                try {
                    result = futureResult.get(5000, TimeUnit.MILLISECONDS);
                } catch (TimeoutException e) {
                    System.out.println("Time out after 5 seconds");

Correctly using loop.create_future

南笙酒味 submitted on 2019-12-10 17:38:18
Question: I was reading the Python documentation and the PyMotW book, trying to learn Async/Await, Futures, and Tasks. The Coroutines and Tasks documentation says:

    Normally there is no need to create Future objects at the application level code.

The Future documentation states the following:

    loop.create_future()
    Create an asyncio.Future object attached to the event loop. This is the preferred way to create Futures in asyncio. This lets third-party event loops provide alternative implementations of the Future
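
Since the excerpt above is cut off, here is a minimal, self-contained sketch of how loop.create_future() is typically used: a coroutine creates a Future on the running loop, arranges for something else to resolve it later, and awaits it. The timer callback is only a stand-in for real work.

    import asyncio

    async def wait_for_value():
        # loop.create_future() is preferred over asyncio.Future() so that
        # alternative event loops can supply their own Future implementation.
        loop = asyncio.get_running_loop()
        fut = loop.create_future()

        # Resolve the future later; a timer callback stands in for real work.
        loop.call_later(0.1, fut.set_result, "done")

        # Awaiting suspends this coroutine until set_result() has run.
        return await fut

    print(asyncio.run(wait_for_value()))  # prints "done"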

How to run generator code in parallel?

不问归期 submitted on 2019-12-10 16:15:14
Question: I have code like this:

    def generator():
        while True:
            # do slow calculation
            yield x

I would like to move the slow calculation to separate process(es). I'm working in Python 3.6, so I have concurrent.futures.ProcessPoolExecutor. It's just not obvious how to concurrent-ize a generator using that. The difference from a regular concurrent scenario using map is that there is nothing to map here (the generator runs forever), and we don't want all the results at once; we want to queue them up and
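
The question is truncated above. One possible approach (my sketch, not a quoted answer) is to keep a small window of calculations in flight with ProcessPoolExecutor and yield each result in submission order; slow_calculation and the window size n_ahead are illustrative placeholders.

    import concurrent.futures
    import itertools

    def slow_calculation(i):
        # placeholder for the real slow work
        return i * i

    def parallel_generator(executor, n_ahead=4):
        # Keep up to n_ahead submissions in flight; yield results in
        # submission order, blocking only when the oldest is not done yet.
        counter = itertools.count()
        pending = [executor.submit(slow_calculation, next(counter))
                   for _ in range(n_ahead)]
        while True:
            oldest = pending.pop(0)
            pending.append(executor.submit(slow_calculation, next(counter)))
            yield oldest.result()

    if __name__ == "__main__":
        with concurrent.futures.ProcessPoolExecutor() as pool:
            for value in itertools.islice(parallel_generator(pool), 10):
                print(value)

Any submissions still in flight when the consumer stops are simply completed and discarded while the executor shuts down.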

Any concurrent.futures timeout that actually works?

女生的网名这么多〃 submitted on 2019-12-10 15:58:52
Question: Tried to write a process-based timeout (sync) on the cheap, like this:

    from concurrent.futures import ProcessPoolExecutor

    def call_with_timeout(func, *args, timeout=3):
        with ProcessPoolExecutor(max_workers=1) as pool:
            future = pool.submit(func, *args)
            result = future.result(timeout=timeout)

But it seems the timeout argument passed to future.result doesn't really work as advertised.

    >>> t0 = time.time()
    ... call_with_timeout(time.sleep, 2, timeout=3)
    ... delta = time.time() - t0
    ... print(
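
The transcript above is cut off. The usual explanation for timeouts like this appearing not to work is that future.result(timeout=...) does raise concurrent.futures.TimeoutError, but leaving the with ProcessPoolExecutor(...) block calls shutdown(wait=True), which still waits for the running worker to finish, so the overall call is not bounded by the timeout. A rough alternative (my own sketch using multiprocessing directly, not the truncated answer) that enforces the limit by terminating the worker process:

    import multiprocessing
    import time

    def call_with_timeout(func, *args, timeout=3):
        # Run func in its own process and kill it if it outlives the timeout.
        # The return value is discarded here; use a multiprocessing.Queue or
        # Pipe if the result is needed.
        proc = multiprocessing.Process(target=func, args=args)
        proc.start()
        proc.join(timeout)
        if proc.is_alive():
            proc.terminate()
            proc.join()
            raise TimeoutError(f"{getattr(func, '__name__', func)!r} exceeded {timeout}s")

    if __name__ == "__main__":
        t0 = time.time()
        call_with_timeout(time.sleep, 2, timeout=3)    # completes normally
        print(f"finished in {time.time() - t0:.1f}s")  # ~2s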

ProcessPoolExecutor logging fails to log inside function on Windows but not on Unix / Mac

霸气de小男生 submitted on 2019-12-10 13:21:02
Question: When I run the following script on a Windows computer, I do not see any of the log messages from the log_pid function; however, I do when I run on Unix / Mac. I've read before that multiprocessing is different on Windows compared to Mac, but it's not clear to me what changes I should make to get this script to work on Windows. I'm running Python 3.6.

    import logging
    import sys
    from concurrent.futures import ProcessPoolExecutor
    import os

    def log_pid(x):
        logger.info('Executing on process: %s' %
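
A likely culprit (hedged, since the script above is truncated): on Windows, worker processes are started with the "spawn" method, so logging configuration done in the parent under if __name__ == '__main__' is not inherited, and the workers' loggers have no handlers. A sketch that configures logging in each worker via an initializer (the initializer argument needs Python 3.7+; on 3.6 the same call can go at the top of the worker function):

    import logging
    import os
    import sys
    from concurrent.futures import ProcessPoolExecutor

    def configure_logging():
        # Spawned workers do not inherit the parent's handlers, so each
        # worker (and the parent) sets up its own.
        handler = logging.StreamHandler(sys.stdout)
        handler.setFormatter(logging.Formatter("%(process)d %(levelname)s %(message)s"))
        root = logging.getLogger()
        root.addHandler(handler)
        root.setLevel(logging.INFO)

    def log_pid(x):
        logging.getLogger(__name__).info("Executing on process: %s", os.getpid())
        return x

    if __name__ == "__main__":
        configure_logging()  # for the parent process
        with ProcessPoolExecutor(max_workers=2, initializer=configure_logging) as pool:
            print(list(pool.map(log_pid, range(4))))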

How to wrap custom future to use with asyncio in Python?

这一生的挚爱 submitted on 2019-12-10 11:11:00
Question: There are a lot of libraries that use their own custom version of Future. kafka and s3transfer are just two examples: all their custom future-like classes have object as the superclass. Not surprisingly, you cannot directly call asyncio.wrap_future() on such objects and can't use await with them. What is the proper way of wrapping such futures for use with asyncio?

Answer 1: If the future class supports standard future features such as done callbacks and the result method, just use something like this:
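
The answer is cut off above. A minimal sketch of that idea, assuming the third-party future exposes add_done_callback(), result() and exception() in the style of concurrent.futures.Future (method names may differ per library):

    import asyncio

    def wrap_custom_future(custom_future):
        loop = asyncio.get_running_loop()
        aio_future = loop.create_future()

        def on_done(f):
            def copy_result():
                if f.exception() is not None:
                    aio_future.set_exception(f.exception())
                else:
                    aio_future.set_result(f.result())
            # The callback may fire on another thread, so hop back onto the
            # event loop thread before touching the asyncio future.
            loop.call_soon_threadsafe(copy_result)

        custom_future.add_done_callback(on_done)
        return aio_future

It can then be awaited from a coroutine, e.g. value = await wrap_custom_future(third_party_future).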

The use of “>>” in Pharo/Smalltalk

核能气质少年 submitted on 2019-12-10 10:25:23
Question: I am implementing futures in Pharo. I came across this website: http://onsmalltalk.com/smalltalk-concurrency-playing-with-futures. I am following this example and trying to replicate it in Pharo. However, when I get to the last step I have no idea what ">>" means; the symbol is also not included in the Smalltalk syntax listed at http://rigaux.org/language-study/syntax-across-languages-per-language/Smalltalk.html.

    BlockClosure>>future
        ^ SFuture new value: self fixTemps

I can see future

Use tqdm with concurrent.futures?

纵饮孤独 submitted on 2019-12-09 14:04:02
Question: I have a multithreaded function that I would like a status bar for using tqdm. Is there an easy way to show a status bar with ThreadPoolExecutor? It is the parallelization part that is confusing me.

    import concurrent.futures

    def f(x):
        return x**2

    my_iter = range(1000000)

    def run(f, my_iter):
        with concurrent.futures.ThreadPoolExecutor() as executor:
            results = list(executor.map(f, my_iter))
        return results

    run(f, my_iter)  # wrap tqdm around this function?

Answer 1: You can wrap tqdm around the
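
The answer is truncated above. A minimal sketch of that idea, wrapping tqdm around the iterator returned by executor.map (assumes tqdm is installed; total= is needed because the map iterator has no length):

    import concurrent.futures
    from tqdm import tqdm

    def f(x):
        return x ** 2

    my_iter = range(1000000)

    def run(func, iterable):
        with concurrent.futures.ThreadPoolExecutor() as executor:
            # The progress bar advances as each mapped result is consumed.
            results = list(tqdm(executor.map(func, iterable), total=len(iterable)))
        return results

    if __name__ == "__main__":
        run(f, my_iter)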

Detect failed tasks in concurrent.futures

与世无争的帅哥 submitted on 2019-12-08 22:02:43
Question: I've been using concurrent.futures as it has a simple interface and lets the user easily control the max number of threads/processes. However, it seems like concurrent.futures hides failed tasks and continues the main thread after all tasks have finished or failed.

    import concurrent.futures

    def f(i):
        return (i + 's')

    with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
        fs = [executor.submit(f, i) for i in range(10)]
        concurrent.futures.wait(fs)

Calling f on any integer leads to a TypeError
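
The question is cut off above. The exception is not lost: it is stored on each Future and surfaces when result() or exception() is called. A short sketch that reports failed tasks as they complete:

    import concurrent.futures

    def f(i):
        return i + 's'  # always raises TypeError for int inputs

    with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
        fs = [executor.submit(f, i) for i in range(10)]
        for fut in concurrent.futures.as_completed(fs):
            # exception() retrieves the stored exception (or None) without
            # re-raising; calling result() would re-raise it in this thread.
            exc = fut.exception()
            if exc is not None:
                print("task failed:", repr(exc))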

Number of max_workers when using ThreadPoolExecutor from concurrent.futures?

做~自己de王妃 submitted on 2019-12-08 15:02:35
Question: What are the factors to consider when deciding what to set max_workers to in ThreadPoolExecutor from concurrent.futures? As long as you can expect Python 3.5+ to be available, is there any reason not to set max_workers to None, which will then "default to the number of processors on the machine, multiplied by 5" as described in the docs here? https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.ThreadPoolExecutor

Answer 1: I don't think this question can be so generically
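
The answer is truncated above. As a rough illustration only (these heuristics are my assumptions, not the answer's), the choice mostly depends on whether the work is I/O-bound or CPU-bound:

    import concurrent.futures
    import os

    # I/O-bound tasks spend most of their time waiting, so many more threads
    # than cores can help; CPU-bound pure-Python tasks gain little from extra
    # threads because of the GIL.
    cpu_count = os.cpu_count() or 1
    io_bound_workers = cpu_count * 5   # mirrors the documented 3.5-3.7 default
    cpu_bound_workers = cpu_count      # more threads rarely help here

    with concurrent.futures.ThreadPoolExecutor(max_workers=io_bound_workers) as pool:
        results = list(pool.map(str.upper, ["io", "bound", "work"]))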