python-multithreading

Script stuck on exit when using atexit to terminate threads

Submitted by 江枫思渺然 on 2020-02-23 04:46:06

Question: I'm playing around with threads on Python 3.7.4, and I want to use atexit to register a cleanup function that will (cleanly) terminate the threads. For example:

```python
# example.py
import threading
import queue
import atexit
import sys

Terminate = object()

class Worker(threading.Thread):
    def __init__(self):
        super().__init__()
        self.queue = queue.Queue()

    def send_message(self, m):
        self.queue.put_nowait(m)

    def run(self):
        while True:
            m = self.queue.get()
            if m is Terminate:
                break
            else:
                print("Received
```
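
A minimal sketch of the usual explanation and workaround (an assumption, since the post is truncated before the actual error): on Python 3, the interpreter joins all non-daemon threads *before* atexit handlers run, so a handler that would send the Terminate sentinel never fires and the script hangs on exit. Making the worker a daemon thread lets shutdown reach the handler, which can then stop the worker cleanly:

```python
# Sketch: daemon worker + atexit cleanup. The key change from the snippet
# above is daemon=True; everything else mirrors the original structure.
import threading
import queue
import atexit

Terminate = object()  # sentinel telling the worker to stop

class Worker(threading.Thread):
    def __init__(self):
        # daemon=True: a daemon thread does not block interpreter shutdown,
        # so atexit handlers get a chance to run and shut it down cleanly
        super().__init__(daemon=True)
        self.queue = queue.Queue()

    def send_message(self, m):
        self.queue.put_nowait(m)

    def run(self):
        while True:
            m = self.queue.get()
            if m is Terminate:
                break
            print("Received:", m)

worker = Worker()
worker.start()

@atexit.register
def cleanup():
    # runs at interpreter exit; asks the worker to stop, then waits for it
    worker.send_message(Terminate)
    worker.join()

worker.send_message("hello")
```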

Unable to use two Threads to execute two functions within a script

Submitted by 女生的网名这么多〃 on 2020-02-07 03:40:08

Question: I've created a scraper using Python in combination with Thread to make the execution faster. The scraper is supposed to parse all the links available within the webpage, ending with different alphabet letters. It does parse them all. However, I wish to parse all the names and phone numbers from those individual links, using Thread again. The first portion I could manage to run using Thread, but I can't get any idea how to create another Thread to execute the latter portion of the script? I could have
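
A minimal two-stage sketch of the pattern being asked about (the function names and data below are hypothetical stand-ins, not from the post): stage one collects links in parallel, stage two parses each collected link in parallel again. `concurrent.futures.ThreadPoolExecutor` avoids managing `Thread` objects by hand for both stages:

```python
from concurrent.futures import ThreadPoolExecutor

def get_links(letter):
    # stand-in for fetching the page for one alphabet letter
    return [f"{letter}-link-{i}" for i in range(3)]

def get_details(link):
    # stand-in for parsing the name and phone number from one link
    return {"link": link, "name": f"name-of-{link}"}

with ThreadPoolExecutor(max_workers=5) as ex:
    # stage 1: one task per letter, flattened into a single list of links
    links = [link for sub in ex.map(get_links, "abc") for link in sub]
    # stage 2: one task per collected link
    details = list(ex.map(get_details, links))
```

The same executor can serve both stages because the second `map` only starts once the first has produced its links.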

Parallelize a nested for loop in python for finding the max value

Submitted by 二次信任 on 2020-02-04 05:48:45

Question: I've been struggling for some time to improve the execution time of this piece of code. Since the calculations are really time-consuming, I think the best solution would be to parallelize the code. The output could also be stored in memory and written to a file afterwards. I am new to both Python and parallelism, so I find it difficult to apply the concepts explained here and here. I also found this question, but I couldn't manage to figure out how to implement the same for my situation. I am

Python threading print overwriting itself [duplicate]

Submitted by 。_饼干妹妹 on 2020-01-30 10:57:07

Question: This question already has answers here: Threading in python using queue (3 answers). Closed 2 years ago.

I have the following, which threads a print function:

```python
from threading import Thread
from random import *
import time

def PrintRandom():
    rand = random()
    time.sleep(rand)
    print(rand)

if __name__ == "__main__":
    Thread(target=PrintRandom).start()
    Thread(target=PrintRandom).start()
```

This works the majority of the time, with the following being output:

```
0.30041615558463897
0.5644155082254415
```

However,
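
A sketch of the usual fix for interleaved output (assuming the truncated part describes two values landing on one line): `print()` may emit the value and the trailing newline as separate writes, so two threads can interleave between them. Building the whole line first and writing it in a single call, guarded by a lock, keeps each line intact:

```python
from threading import Thread, Lock
from random import random
import sys
import time

print_lock = Lock()

def print_random():
    rand = random()
    time.sleep(rand)
    with print_lock:                   # serialize access to stdout
        sys.stdout.write(f"{rand}\n")  # one atomic-enough write per line

threads = [Thread(target=print_random) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The linked duplicate's queue-based answer is the other common approach: have worker threads put results on a `queue.Queue` and let a single thread do all the printing.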

aiohttp slowness with threading

Submitted by China☆狼群 on 2020-01-24 14:11:07

Question: I copied the code from "How to run an aiohttp server in a thread?". It runs fine. Then I added a one-second sleep. When I launch 10 requests at the same time, the average response time is 9 seconds. Why is that? Wouldn't all requests come back in a little over 1 second?

```python
import asyncio
import threading
from aiohttp import web
import time

loop = asyncio.get_event_loop()

def say_hello(request):
    time.sleep(1)
    return web.Response(text='Hello, world')

app = web.Application(debug=True)
app.add
```
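
The slowdown is the classic symptom of blocking the event loop: `time.sleep(1)` inside the handler stops the single thread running aiohttp's event loop, so ten concurrent requests are served one after another. This aiohttp-free sketch shows the same effect in reverse with plain asyncio: awaiting `asyncio.sleep` instead of calling `time.sleep` lets ten simulated handlers overlap (sleep times are scaled down to 0.1 s to keep it quick):

```python
import asyncio
import time

async def handler():
    # non-blocking sleep: yields control back to the event loop
    await asyncio.sleep(0.1)

async def main():
    start = time.monotonic()
    # ten "requests" running concurrently on one event loop
    await asyncio.gather(*(handler() for _ in range(10)))
    return time.monotonic() - start

elapsed = asyncio.run(main())
# ten overlapping 0.1 s sleeps complete in roughly 0.1 s total, not 1 s
```

In the aiohttp server itself the equivalent fix is to make `say_hello` an `async def` handler that awaits `asyncio.sleep(1)` (or runs truly blocking work in an executor).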

Show progressbar in Qt with computationally heavy background process

Submitted by 扶醉桌前 on 2020-01-23 04:02:29

Question: I'm building an application that lets the user export their work. This is a computationally heavy process, lasting for a minute or so, during which I want to show a progress bar (and make the rest of the UI unresponsive). I've tried the implementation below, which works fine for a background process that is not computationally expensive (e.g. waiting for 0.1 s). However, for a CPU-heavy process, the UI becomes very laggy and unresponsive (though not completely unresponsive). Any idea how I can solve
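
A toolkit-free sketch of the usual remedy (an assumption about the cause, since the post's implementation is cut off): a Python *thread* running CPU-bound work still competes with the GUI thread for the GIL, which is why the UI stays laggy. Moving the work into separate *processes* and reporting progress through a callback keeps the GUI event loop free. The chunked `export_chunk` below is a hypothetical stand-in for the real export job:

```python
from concurrent.futures import ProcessPoolExecutor

def export_chunk(chunk_id):
    # stand-in for one slice of the heavy export computation
    return sum(i * i for i in range(50_000))

def run_export(n_chunks=10, progress_callback=None):
    """Run the export in worker processes, reporting percent complete."""
    results = []
    with ProcessPoolExecutor(max_workers=2) as ex:
        for done, res in enumerate(ex.map(export_chunk, range(n_chunks)), 1):
            results.append(res)
            if progress_callback:
                progress_callback(100 * done // n_chunks)
    return results
```

In a Qt application the natural wiring is to run `run_export` from a worker `QThread` (which now only *waits* on the processes, so it no longer fights the GUI for the GIL) and have `progress_callback` emit a signal connected to `QProgressBar.setValue`.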

Why coroutines cannot be used with run_in_executor?

Submitted by ▼魔方 西西 on 2020-01-22 12:20:26

Question: I want to run a service that requests URLs using coroutines and multithreading. However, I cannot pass coroutines to the workers in the executor. See the code below for a minimal example of the issue:

```python
import time
import asyncio
import concurrent.futures

EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=5)

async def async_request(loop):
    await asyncio.sleep(3)

def sync_request(_):
    time.sleep(3)

async def main(loop):
    futures = [loop.run_in_executor(EXECUTOR, async_request, loop) for x in
```
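
A sketch of the distinction the question is running into: `run_in_executor` expects a plain callable, so handing it a coroutine function merely *creates* a coroutine object in the worker thread and never awaits it. Coroutines are scheduled directly on the event loop (e.g. with `asyncio.gather`); only blocking synchronous calls need the executor. Sleep times are scaled down here to keep the example quick:

```python
import asyncio
import concurrent.futures
import time

EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=5)

async def async_request():
    await asyncio.sleep(0.1)   # cooperative: no executor needed
    return "async"

def sync_request():
    time.sleep(0.1)            # blocking: belongs in the executor
    return "sync"

async def main():
    loop = asyncio.get_running_loop()
    coros = [async_request() for _ in range(5)]               # scheduled by gather
    futures = [loop.run_in_executor(EXECUTOR, sync_request)   # run in threads
               for _ in range(5)]
    # gather accepts both coroutines and futures and preserves order
    return await asyncio.gather(*coros, *futures)

results = asyncio.run(main())
```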

How to call MessageLoopWork in cefpython?

Submitted by 烂漫一生 on 2020-01-21 13:48:49

Question: I made a simple off-screen renderer with cefpython. I used cefpython.MessageLoop(), but I would like to execute a JavaScript function with browser.GetFocusedFrame().ExecuteFunction, which must be called from the main UI thread. Is there a way to set a callback on cefpython's message loop? Alternatively, I could use MessageLoopWork, but I don't know how. I tried to call it in a separate thread, but it does not work:

```python
import threading

def main_loop():
    cefpython.MessageLoopWork()
    threading.Timer(0.01,
```
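
A hedged sketch of why the attempt above fails and what the pattern looks like: cefpython's documentation states that `MessageLoopWork` must be pumped repeatedly (roughly every 10 ms) on the *main/UI thread*, while `threading.Timer` runs its callback on a new worker thread. The usual solution is to drive the pump from the host GUI toolkit's timer (e.g. a Qt `QTimer` or Tk `after` callback). The loop below is a toolkit-free stand-in for that pattern; `do_message_loop_work` is a placeholder for `cefpython.MessageLoopWork`:

```python
import time

def do_message_loop_work():
    # placeholder for cefpython.MessageLoopWork(); between pumps on this
    # same thread it is also safe to call main-thread-only APIs such as
    # browser.GetFocusedFrame().ExecuteFunction(...)
    pass

def pump_main_thread(duration=0.1, interval=0.01):
    """Call the CEF work function every `interval` seconds on this thread."""
    ticks = 0
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        do_message_loop_work()
        ticks += 1
        time.sleep(interval)
    return ticks

ticks = pump_main_thread()
```

In a real application this loop is replaced by the toolkit's own event loop scheduling the pump, so the main thread stays responsive while CEF work gets serviced.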