concurrent.futures

How to use Python Concurrent Futures with decorators [closed]

Submitted by 孤人 on 2019-12-07 23:29:28
Question (closed as off-topic; no longer accepting answers): I'm using a decorator for the thread pool executor:

    from functools import wraps
    from .bounded_pool_executor import BoundedThreadPoolExecutor

    _DEFAULT_POOL = BoundedThreadPoolExecutor(max_workers=5)

    def threadpool(f, executor=None):
        @wraps(f)
        def wrap(*args, **kwargs):
            return (executor or _DEFAULT_POOL).submit(f,
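The excerpt above cuts off mid-call. A minimal, runnable sketch of the same decorator pattern, substituting the standard-library ThreadPoolExecutor for the question's custom BoundedThreadPoolExecutor (which is not shown in full):

```python
from concurrent.futures import ThreadPoolExecutor
from functools import wraps

# Standard-library pool stands in for the question's BoundedThreadPoolExecutor.
_DEFAULT_POOL = ThreadPoolExecutor(max_workers=5)

def threadpool(f=None, *, executor=None):
    """Calling the decorated function submits it to a pool and
    returns a Future instead of running it synchronously."""
    def decorate(func):
        @wraps(func)
        def wrap(*args, **kwargs):
            return (executor or _DEFAULT_POOL).submit(func, *args, **kwargs)
        return wrap
    # Support both bare @threadpool and @threadpool(executor=...) usage.
    return decorate if f is None else decorate(f)

@threadpool
def slow_add(a, b):
    return a + b

print(slow_add(2, 3).result())  # 5
```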

asyncio yield from concurrent.futures.Future of an Executor

Submitted by 不想你离开。 on 2019-12-06 19:27:27
Question: I have a long_task function which runs a heavy CPU-bound calculation, and I want to make it asynchronous using the new asyncio framework. The resulting long_task_async function uses a ProcessPoolExecutor to offload work to a different process so it is not constrained by the GIL. The trouble is that, for some reason, the concurrent.futures.Future instance returned from ProcessPoolExecutor.submit throws a TypeError when yielded from. Is this by design? Are those futures not compatible with
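A concurrent.futures.Future is indeed not directly awaitable; the usual bridge is loop.run_in_executor (or asyncio.wrap_future). A minimal sketch, with a hypothetical long_task standing in for the question's calculation:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def long_task(n):
    # Stand-in for the question's CPU-bound calculation.
    return sum(i * i for i in range(n))

async def long_task_async(n):
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # run_in_executor wraps the pool's concurrent.futures.Future in an
        # awaitable asyncio future; yielding the raw Future is what raises
        # the TypeError the question describes.
        return await loop.run_in_executor(pool, long_task, n)

if __name__ == "__main__":
    print(asyncio.run(long_task_async(10)))  # 285
```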

How can I cancel a hanging asynchronous task in tornado, with a timeout?

Submitted by 浪尽此生 on 2019-12-06 09:13:07
My setup is a Python Tornado server which asynchronously processes tasks with a ThreadPoolExecutor. Under some conditions the task might turn into an infinite loop. With the with_timeout decorator, I have managed to catch the timeout exception and return an error result to the client. The problem is that the task is still running in the background. How is it possible to stop the task from running in the ThreadPoolExecutor? Or is it possible to cancel the Future? Here is the code that reproduces the problem. Run the code with tornado 4 and the concurrent.futures library and go to http://localhost
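The short answer is that Future.cancel() only works before the task starts; a thread already running inside a ThreadPoolExecutor cannot be killed from outside, so the task itself must cooperate. A minimal sketch of cooperative cancellation via a flag (names are illustrative, not from the question):

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

stop_event = threading.Event()

def looping_task():
    # Cooperative cancellation: the loop checks a flag, because a thread
    # already running in a ThreadPoolExecutor cannot be killed externally.
    n = 0
    while not stop_event.is_set():
        n += 1
        time.sleep(0.01)
    return n

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(looping_task)
time.sleep(0.1)              # let the worker pick the task up
print(future.cancel())       # False: cancel() cannot stop a running task
stop_event.set()             # but the cooperative flag can
print(future.result() >= 1)  # True: the loop exited cleanly
executor.shutdown()
```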

How to print results of Python ThreadPoolExecutor.map immediately?

Submitted by …衆ロ難τιáo~ on 2019-12-06 06:07:25
I am running a function over several sets of iterables, returning a list of all results only once all processes are finished.

    def fct(variable1, variable2):
        # do an operation that does not necessarily take the same amount of
        # time for different input variables and yields result1 and result2
        return result1, result2

    variables1 = [1,2,3,4]
    variables2 = [7,8,9,0]

    with ThreadPoolExecutor(max_workers=8) as executor:
        future = executor.map(fct, variables1, variables2)
        print '[%s]' % ', '.join(map(str, future))

    >>> [ (12,3) , (13,4) , (14,5) , (15,6) ]

How can I print intermediary results e.g. for
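map() hands back results strictly in input order, so the usual way to print each result the moment it is ready is submit() plus as_completed(). A sketch with a hypothetical fct simulating uneven runtimes:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def fct(v1, v2):
    # Hypothetical stand-in: uneven runtimes, two results per call.
    time.sleep(random.uniform(0, 0.1))
    return v1 + v2, v1 * v2

variables1 = [1, 2, 3, 4]
variables2 = [7, 8, 9, 0]

results = []
with ThreadPoolExecutor(max_workers=8) as executor:
    # as_completed() yields each future the moment it finishes,
    # regardless of submission order.
    futures = [executor.submit(fct, a, b) for a, b in zip(variables1, variables2)]
    for fut in as_completed(futures):
        result = fut.result()
        print(result)          # printed as soon as that task is done
        results.append(result)
```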

The use of “>>” in Pharo/Smalltalk

Submitted by 為{幸葍}努か on 2019-12-06 01:22:16
I am implementing futures in Pharo. I came across this website: http://onsmalltalk.com/smalltalk-concurrency-playing-with-futures . I am following this example and trying to replicate it on Pharo. However, I get to the last step and I have no idea what ">>" means; the symbol is also not listed as part of Smalltalk syntax at http://rigaux.org/language-study/syntax-across-languages-per-language/Smalltalk.html :

    BlockClosure>>future
        ^ SFuture new value: self fixTemps

I can see future is not a variable or a method implemented by BlockClosure. What should I do with this part of the

Python concurrent.futures using subprocess with a callback

Submitted by 故事扮演 on 2019-12-05 23:18:26
I am executing a FORTRAN exe from Python. The FORTRAN exe takes many minutes to complete; therefore, I need a callback to be fired when the exe finishes. The exe does not return anything back to Python, but in the callback function I will use Python to parse output text files from the FORTRAN run. For this I am using concurrent.futures and add_done_callback(), and it works. But this is part of a web service, and I need the Python method that calls subprocess.call()/Popen() to return as soon as the FORTRAN exe is launched; then, when the FORTRAN run is complete, the callback function is called. def
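A minimal sketch of that shape, with a trivial child process standing in for the FORTRAN exe (the function names here are illustrative, not from the question):

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=2)

def parse_output(future):
    # Fired when the external program exits; this is where the question's
    # FORTRAN output files would be parsed.
    print("exit code:", future.result())

def launch(cmd):
    """Returns immediately; parse_output runs once the process exits."""
    future = executor.submit(subprocess.call, cmd)
    future.add_done_callback(parse_output)
    return future

# A trivial child process stands in for the long-running FORTRAN exe.
f = launch([sys.executable, "-c", "pass"])
f.result()  # demonstration only; a web handler would return instead
executor.shutdown()
```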

Function that multiprocesses another function

Submitted by 吃可爱长大的小学妹 on 2019-12-05 18:06:46
I'm performing analyses of time series from simulations. Basically, the same tasks are run for every time step. As there is a very high number of time steps, and as the analysis of each of them is independent, I wanted to create a function that can multiprocess another function. The latter will take arguments and return a result. Using a shared dictionary and the concurrent.futures library, I managed to write this:

    import concurrent.futures as Cfut

    def multiprocess_loop_grouped(function, param_list, group_size, Nworkers, *args):
        # function : function that is running in parallel
        # param_list :
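The question's grouped, shared-dictionary version is cut off above; a much-reduced sketch of the same idea (a generic wrapper that fans one function out over a parameter list, with a hypothetical per-step analysis) could look like:

```python
import concurrent.futures as Cfut

def analyse_step(t):
    # Hypothetical per-time-step analysis.
    return t * t

def multiprocess_loop(function, param_list, n_workers):
    """Run `function` over every item of `param_list` in a process pool,
    returning results in input order. (The question's grouped,
    shared-dictionary version is more elaborate than this sketch.)"""
    with Cfut.ProcessPoolExecutor(max_workers=n_workers) as executor:
        return list(executor.map(function, param_list))

if __name__ == "__main__":
    print(multiprocess_loop(analyse_step, [1, 2, 3, 4], 2))  # [1, 4, 9, 16]
```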

Why is asyncio.Future incompatible with concurrent.futures.Future?

Submitted by 无人久伴 on 2019-12-05 02:49:02
The two classes represent excellent abstractions for concurrent programming, so it's a bit disconcerting that they don't support the same API. Specifically, according to the docs, asyncio.Future is almost compatible with concurrent.futures.Future, with these differences:

    - result() and exception() do not take a timeout argument and raise an
      exception when the future isn't done yet.
    - Callbacks registered with add_done_callback() are always called via the
      event loop's call_soon_threadsafe().
    - This class is not compatible with the wait() and as_completed()
      functions in the concurrent.futures package.

The
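Despite the API gap, the standard library does provide a bridge between the two: asyncio.wrap_future turns a concurrent.futures.Future into an awaitable asyncio.Future. A small sketch:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def blocking():
    return 42

async def main():
    with ThreadPoolExecutor() as pool:
        cf = pool.submit(blocking)    # a concurrent.futures.Future
        af = asyncio.wrap_future(cf)  # bridged to an asyncio.Future
        return await af               # the raw cf is not awaitable

print(asyncio.run(main()))  # 42
```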

How to chain futures in a non-blocking manner? That is, how to use one future as an input in another future without blocking?

Submitted by 巧了我就是萌 on 2019-12-05 02:04:57
Using the example below, how can future2 use the result of future1 once future1 is complete (without blocking future3 from being submitted)?

    from concurrent.futures import ProcessPoolExecutor
    import time

    def wait(seconds):
        time.sleep(seconds)
        return seconds

    pool = ProcessPoolExecutor()
    s = time.time()
    future1 = pool.submit(wait, 5)
    future2 = pool.submit(wait, future1.result())  # blocks until future1 is done
    future3 = pool.submit(wait, 10)

    time_taken = time.time() - s
    print(time_taken)

This is achievable by carefully crafting a callback to submit the second operation after the first one has completed. Sadly, it is not
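One way to craft such a callback is to hand the caller a placeholder Future that is fulfilled when the chained submission finishes. A sketch under the assumption of a thread pool and short waits (creating a bare Future by hand is unusual, but works here); the `chain` helper is illustrative, not from the question:

```python
import time
from concurrent.futures import Future, ThreadPoolExecutor

pool = ThreadPoolExecutor()

def wait(seconds):
    time.sleep(seconds)
    return seconds

def chain(prev, fn):
    """When `prev` finishes, submit fn(prev.result()) and forward that
    result into the returned placeholder Future; the caller never blocks."""
    out = Future()
    def on_done(f):
        inner = pool.submit(fn, f.result())
        inner.add_done_callback(lambda g: out.set_result(g.result()))
    prev.add_done_callback(on_done)
    return out

future1 = pool.submit(wait, 0.2)
future2 = chain(future1, wait)    # consumes future1's result, no blocking
future3 = pool.submit(wait, 0.1)  # submitted immediately, not held up
print(future2.result())           # 0.2
```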

Checking up on a `concurrent.futures.ThreadPoolExecutor`

Submitted by ▼魔方 西西 on 2019-12-05 01:27:28
I've got a live concurrent.futures.ThreadPoolExecutor. I want to check its status: how many threads there are, how many are handling tasks (and which tasks), how many are free, and which tasks are in the queue. How can I find out these things? There is some visibility into the pool and the pending work-item queue. To find out what's available, print poolx.__dict__ to see the structure, and read the concurrent.futures.thread source, which is quite readable. The following creates a pool with one thread. It then creates two jobs: one sleeps for 3 seconds, the other immediately returns
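A compressed version of that experiment, poking at the pool's internals. Note that _threads and _work_queue are private CPython implementation details: fine for debugging, but they may change between Python versions.

```python
import time
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=1)
slow = pool.submit(time.sleep, 0.5)  # occupies the single worker
time.sleep(0.1)                      # give the worker time to pick it up
fast = pool.submit(lambda: 1)        # waits in the queue behind it

# Private attributes, inspected for debugging only.
print("threads started:", len(pool._threads))     # 1
print("items queued:", pool._work_queue.qsize())  # 1
fast.result()
pool.shutdown()
```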