concurrent.futures

Why is ProcessPoolExecutor working serially?

扶醉桌前 submitted on 2020-08-09 09:07:17
Question:

    from concurrent.futures import ProcessPoolExecutor
    import os
    import time

    def parInnerLoop(item):
        print(f'Processing {os.getpid()} started on {item}')
        time.sleep(3)
        print(f'Processing {os.getpid()} done on {item}')

    def main():
        executor = ProcessPoolExecutor(max_workers=4)
        for itemNo in range(10):
            executor.submit(parInnerLoop(itemNo))

    if __name__ == '__main__':
        main()

What I'm trying to achieve is a parallel for loop, similar to MATLAB, e.g.:

    parfor itemNo = 0:9
        parInnerLoop(itemNo);
    end

What I'm …
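For context, the usual explanation for the serial behaviour is that executor.submit(parInnerLoop(itemNo)) calls parInnerLoop in the main process and hands its return value (None) to submit, so no work ever reaches the pool. A minimal corrected sketch, assuming the goal is simply to fan the loop body out to worker processes:

    from concurrent.futures import ProcessPoolExecutor
    import os
    import time

    def parInnerLoop(item):
        print(f'Processing {os.getpid()} started on {item}')
        time.sleep(3)
        print(f'Processing {os.getpid()} done on {item}')

    def main():
        # Pass the callable and its argument separately; calling the function here
        # would run it in the main process and submit None to the pool instead.
        with ProcessPoolExecutor(max_workers=4) as executor:
            for itemNo in range(10):
                executor.submit(parInnerLoop, itemNo)
        # Leaving the with-block waits for all submitted tasks to finish.

    if __name__ == '__main__':
        main()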

How to stop a loop running in an executor?

ⅰ亾dé卋堺 submitted on 2020-07-19 11:21:05
Question: I am running a function that takes time to finish. The user has the choice to stop this function/event. Is there an easy way to stop the thread or loop?

    class ThreadsGenerator:
        MAX_WORKERS = 5

        def __init__(self):
            self._executor = ThreadPoolExecutor(max_workers=self.MAX_WORKERS)
            self.loop = None
            self.future = None

        def execute_function(self, function_to_execute, *args):
            self.loop = asyncio.get_event_loop()
            self.future = self.loop.run_in_executor(self._executor, function_to_execute, *args)
            return …
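Worker threads in a ThreadPoolExecutor cannot be killed from outside; the usual pattern is cooperative cancellation, where the long-running function periodically checks a flag. A sketch of that idea, with stop_event and long_running as illustrative names that are not part of the original class:

    import asyncio
    import threading
    import time
    from concurrent.futures import ThreadPoolExecutor

    stop_event = threading.Event()  # hypothetical flag; not in the original code

    def long_running(n):
        # The worker has to check the flag itself: an executor cannot kill a running thread.
        for i in range(n):
            if stop_event.is_set():
                return f'stopped at iteration {i}'
            time.sleep(0.01)  # stand-in for one unit of real work
        return 'finished'

    async def main():
        loop = asyncio.get_running_loop()
        with ThreadPoolExecutor(max_workers=5) as executor:
            future = loop.run_in_executor(executor, long_running, 1000)
            await asyncio.sleep(0.1)   # the user decides to stop here
            stop_event.set()           # request a cooperative shutdown
            print(await future)

    asyncio.run(main())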

Synchronous generator in asyncio

蓝咒 submitted on 2020-07-09 14:34:39
Question: I have the following scenario:

- I have a blocking, synchronous generator
- I have a non-blocking, async function

I would like to run the blocking generator (executed in a ThreadPool) and the async function on the event loop. How do I achieve this? The following code simply prints the output from the generator, not from the sleep function. Thanks!

    from concurrent.futures import ThreadPoolExecutor
    import numpy as np
    import asyncio
    import time

    def f():
        while True:
            r = np.random.randint(0, 3)
            time…
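One possible arrangement, sketched under the assumption that the generator never yields None (used here as an end-of-stream sentinel): pull each item from the blocking generator inside the thread pool via run_in_executor, so the event loop stays free to run the async function at the same time.

    import asyncio
    import time
    from concurrent.futures import ThreadPoolExecutor

    def blocking_gen():
        # stand-in for the blocking, synchronous generator
        for i in range(5):
            time.sleep(0.5)
            yield i

    async def consume(loop, executor):
        gen = blocking_gen()
        while True:
            # Fetch the next item in a worker thread so the event loop stays responsive.
            item = await loop.run_in_executor(executor, next, gen, None)
            if item is None:          # sentinel: the generator is exhausted
                break
            print('generator produced', item)

    async def ticker():
        # the non-blocking async function, sharing the same event loop
        while True:
            print('tick')
            await asyncio.sleep(0.3)

    async def main():
        loop = asyncio.get_running_loop()
        with ThreadPoolExecutor(max_workers=1) as executor:
            tick_task = asyncio.create_task(ticker())
            await consume(loop, executor)
            tick_task.cancel()
            try:
                await tick_task
            except asyncio.CancelledError:
                pass

    asyncio.run(main())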

Retrieve API data into dataframe using multi threading module

こ雲淡風輕ζ submitted on 2020-07-03 05:12:18
Question: I'm using a third-party API to retrieve 10-minute data from a large number of days for different tags. The current data pull can take up to several minutes, depending of course on the number of days and the number of tags. I'm therefore trying my hand at multithreading, which I understand can be useful for heavy I/O operations. The API call goes as follows (I've replaced the actual API name):

    import numpy as N
    import requests as r
    import json
    import pandas as pd
    from datetime import datetime
    import …
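As a rough illustration of the threading part, the sketch below fans one request per tag out to a ThreadPoolExecutor and concatenates the results into one DataFrame; the endpoint URL, query parameters, and JSON layout are placeholders, not the real API.

    from concurrent.futures import ThreadPoolExecutor
    import pandas as pd
    import requests

    BASE_URL = 'https://api.example.com/data'   # placeholder endpoint, not the real API

    def fetch_tag(tag):
        # One blocking HTTP call per tag; the JSON layout used here is illustrative only.
        resp = requests.get(BASE_URL, params={'tag': tag, 'interval': '10min'}, timeout=30)
        resp.raise_for_status()
        return pd.DataFrame(resp.json()['records'])

    def fetch_all(tags, max_workers=8):
        # Threads overlap the network waits, which is where most of the time is spent.
        with ThreadPoolExecutor(max_workers=max_workers) as executor:
            frames = list(executor.map(fetch_tag, tags))
        return pd.concat(frames, ignore_index=True)

    # df = fetch_all(['tag1', 'tag2', 'tag3'])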

Multithreading inside Multiprocessing in Python

谁说胖子不能爱 submitted on 2020-06-29 06:41:57
Question: I am using the concurrent.futures module to do multiprocessing and multithreading. I am running it on an 8-core machine with 16 GB RAM and an Intel i7 8th Gen processor. I tried this on Python 3.7.2 and even on Python 3.8.2.

    import concurrent.futures
    import time

    # takes a list and multiplies each elem by 2
    def double_value(x):
        y = []
        for elem in x:
            y.append(2 * elem)
        return y

    # multiply an elem by 2
    def double_single_value(x):
        return 2 * x

    # define a
    import numpy as np
    a = np.arange(100000000).reshape(100, 1000000)
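Below is a minimal sketch of the nesting pattern the question describes, a process pool whose workers each run their own thread pool, using small illustrative chunks rather than the 100-million-element array. For pure-CPU work like doubling numbers, the inner threads are limited by the GIL and mainly add overhead; the sketch only shows the structure.

    import concurrent.futures

    def double_single_value(x):
        return 2 * x

    def double_chunk(chunk):
        # Runs inside a worker process: fan the chunk's elements out to threads.
        with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
            return list(pool.map(double_single_value, chunk))

    def main():
        chunks = [list(range(i, i + 5)) for i in range(0, 20, 5)]
        # Both functions are defined at module level, so they pickle cleanly.
        with concurrent.futures.ProcessPoolExecutor(max_workers=4) as procs:
            results = list(procs.map(double_chunk, chunks))
        print(results)

    if __name__ == '__main__':
        main()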

How does ThreadPoolExecutor().map differ from ThreadPoolExecutor().submit?

老子叫甜甜 submitted on 2020-05-24 08:53:28
Question: I was just very confused by some code that I wrote. I was surprised to discover that:

    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        results = list(executor.map(f, iterable))

and

    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        results = list(map(lambda x: executor.submit(f, x), iterable))

produce different results. The first one produces a list of whatever type f returns, the second produces a list of concurrent.futures.Future objects that then …
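A small side-by-side sketch of the difference: executor.map calls f for you and yields f's return values in input order, while executor.submit returns Future objects that you unwrap yourself with .result().

    import concurrent.futures

    def f(x):
        return x * x

    iterable = range(5)

    # map: yields f's return values directly, in input order.
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        mapped = list(executor.map(f, iterable))          # [0, 1, 4, 9, 16]

    # submit: returns Future objects; call .result() to get the values.
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        futures = [executor.submit(f, x) for x in iterable]
        submitted = [fut.result() for fut in futures]     # [0, 1, 4, 9, 16]

    print(mapped == submitted)  # True once the futures are resolved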

How to fix or reorganize this multiprocessing pattern to avoid pickling errors?

我与影子孤独终老i submitted on 2020-05-17 08:47:06
Question: Another pickling question ... The following leads to pickling errors. I think it has to do with the scoping or something; I am not sure yet. The goal is to have a decorator that takes arguments and enriches a function with methods. If the best way is to simply construct classes explicitly, then that is fine, but this is meant to hide things from users writing "content".

    import concurrent.futures
    import functools

    class A():
        def __init__(self, fun, **kwargs):
            self.fun = fun
            self.stuff = kwargs
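For context, ProcessPoolExecutor pickles whatever is submitted, and pickle locates functions and classes by their module-level name; a decorator that replaces a function with an instance of A, or any nested or locally defined callable, breaks that lookup. The sketch below only demonstrates that contrast between a module-level callable and a lambda; it is not the decorator from the question, and the exact exception type can vary.

    import concurrent.futures

    def module_level(x):              # top-level function: picklable by reference
        return x + 1

    def main():
        nested = lambda x: x + 1      # lambdas and nested functions cannot be pickled

        with concurrent.futures.ProcessPoolExecutor() as executor:
            print(executor.submit(module_level, 1).result())   # works: prints 2
            try:
                executor.submit(nested, 1).result()
            except Exception as exc:                           # typically a PicklingError
                print(type(exc).__name__, exc)

    if __name__ == '__main__':
        main()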

Locking in multi-threaded pywinauto send keys

大城市里の小女人 submitted on 2020-02-07 02:37:25
Question: I'm new to pywinauto. I'm creating several Notepad windows and typing text in all of them. The windows are not dependent on each other, so this could run concurrently using threads. However, when I try that, the text gets messed up because several threads access the type_keys() method at the same time. Is there any way I can achieve the same concurrently?

Answer 1: There is another method, .set_text("..."), that doesn't require the window to be in focus. It's …
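One common workaround is to keep the threads but serialize just the focus-dependent typing with a lock, so keystrokes from different threads cannot interleave; the rest of each thread's work still overlaps. A sketch under the assumption of an English-locale Notepad (the UntitledNotepad/Edit names may differ on other systems); per the answer above, set_text-style methods that write to the control directly avoid the focus problem altogether.

    import threading
    from concurrent.futures import ThreadPoolExecutor
    from pywinauto.application import Application

    type_lock = threading.Lock()   # hypothetical module-level lock, not in the original code

    def open_and_type(text):
        app = Application(backend='win32').start('notepad.exe')
        edit = app.UntitledNotepad.Edit
        # type_keys() sends real keystrokes to the focused window, so only one
        # thread may type at a time; window creation still runs in parallel.
        with type_lock:
            edit.type_keys(text, with_spaces=True)

    with ThreadPoolExecutor(max_workers=3) as executor:
        executor.map(open_and_type, ['first window', 'second window', 'third window'])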