python-multithreading

Key echo in Python in separate thread doesn't display first key stroke

点点圈 submitted on 2019-12-07 03:20:48

Question: I would post a minimal working example, but unfortunately this problem requires a lot of pieces, so I have stripped it down as best I can. First of all, I'm using a simple script that simulates pressing keys through a function call. This is tweaked from here.

```python
import ctypes

SendInput = ctypes.windll.user32.SendInput
PUL = ctypes.POINTER(ctypes.c_ulong)

class KeyBdInput(ctypes.Structure):
    _fields_ = [("wVk", ctypes.c_ushort),
                ("wScan", ctypes.c_ushort),
                ("dwFlags", ctypes.c_ulong),
                (
```
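For context, the struct above is truncated, but it follows the Win32 `KEYBDINPUT` layout documented in the Windows API. A minimal sketch of the complete definition (field names from the Win32 docs; the actual `SendInput` call is Windows-only and omitted here):

```python
import ctypes

PUL = ctypes.POINTER(ctypes.c_ulong)

class KeyBdInput(ctypes.Structure):
    # Mirrors the Win32 KEYBDINPUT struct consumed by SendInput
    _fields_ = [("wVk", ctypes.c_ushort),     # virtual-key code
                ("wScan", ctypes.c_ushort),   # hardware scan code
                ("dwFlags", ctypes.c_ulong),  # KEYEVENTF_* flags
                ("time", ctypes.c_ulong),     # event timestamp (0 = now)
                ("dwExtraInfo", PUL)]         # extra pointer-sized data

k = KeyBdInput(0x41, 0, 0, 0, None)  # 0x41 is the virtual-key code for 'A'
print(k.wVk)
```

To send a real event, this struct must still be wrapped in the `INPUT` union and passed to `ctypes.windll.user32.SendInput` on Windows.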

Where is the memory leak? How to timeout threads during multiprocessing in python?

*爱你&永不变心* submitted on 2019-12-06 19:46:19

Question: It is unclear how to properly time out workers of joblib's Parallel in Python. Others have had similar questions here, here, here and here. In my example I am utilizing a pool of 50 joblib workers with the threading backend.

Parallel call (threading):

```python
output = Parallel(n_jobs=50, backend='threading')(
    delayed(get_output)(INPUT) for INPUT in list)
```

Here, Parallel hangs without errors as soon as len(list) <= n_jobs but only when n_jobs => -1. In order to circumvent this issue, people give

How to wait for an input without blocking timer in python?

て烟熏妆下的殇ゞ submitted on 2019-12-06 14:26:48

I need to print a message every x seconds; at the same time, I need to listen for the user's input. If 'q' is pressed, it should kill the program. For example:

```
some message
.
.          # after the specified interval
.
some message
q          # program should end
```

The current problem I face is that raw_input is blocking, which stops my function from repeating the message. How do I get input reading and my function to run in parallel?

EDIT: It turns out that raw_input was not blocking. I misunderstood how multi-threading works. I'll leave this here in case anyone stumbles upon it. You can use threading to
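As the EDIT notes, raw_input only blocks the thread that calls it, so the usual pattern is a daemon thread that prints on a timer while the main thread waits for input. A minimal sketch (Python 3; `time.sleep` stands in for the blocking `input()` loop so the example runs unattended):

```python
import threading
import time

stop = threading.Event()
printed = []  # record of messages, only so the example is checkable

def announcer(message, interval):
    # Repeat the message until the main thread sets the stop event.
    while not stop.is_set():
        print(message)
        printed.append(message)
        stop.wait(interval)  # sleeps, but wakes early if stop is set

t = threading.Thread(target=announcer, args=("some message", 0.1), daemon=True)
t.start()

# In the real program this line would be:  while input() != "q": pass
time.sleep(0.35)
stop.set()
t.join()
```

Using `Event.wait(interval)` instead of `time.sleep` lets the announcer thread exit promptly when 'q' arrives, rather than finishing a full sleep cycle first.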

Why is my output from a subprocess delayed, when it is generated from a Python thread?

依然范特西╮ submitted on 2019-12-06 13:56:01

Question: This is an extension of my posting from yesterday, which is still unsolved: Why does my Python thread with subprocess not work as expected? In the meantime I found some interesting details, so I decided to create a new posting. To get to the point: there are some issues when a subprocess is spawned from a thread. Platform: Windows 7 Enterprise, Python 3.6.1. In the following code I want to run a C executable and capture its stdout output into a string. For test purposes the executable
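A frequent cause of delayed subprocess output is block buffering on the child's pipe: the parent only sees output when the child flushes or exits. A sketch that reads the pipe line by line from a thread as output arrives (an unbuffered Python child stands in for the C executable here):

```python
import subprocess
import sys
import threading

captured = []

def run_and_capture(cmd):
    # Stream the child's stdout line by line as it is produced,
    # rather than waiting for communicate() to return everything at once.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            universal_newlines=True, bufsize=1)
    for line in proc.stdout:
        captured.append(line.rstrip("\n"))
    proc.wait()

# "-u" forces the child's stdout to be unbuffered, so lines appear
# immediately instead of when the child's buffer fills or it exits.
cmd = [sys.executable, "-u", "-c", "print('hello from child')"]
worker = threading.Thread(target=run_and_capture, args=(cmd,))
worker.start()
worker.join()
print(captured)
```

For a C executable the same buffering applies on the child's side: printf output to a pipe is block-buffered unless the program calls fflush or disables buffering.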

Python threading issue, raw_input() blocks thread, runaway thread

两盒软妹~` submitted on 2019-12-06 07:59:38

I'm having an issue with threading in Python: after I start a thread, a subsequent call to raw_input() seems to block the thread. Here is a minimal example:

```python
import threading
import time

class tread_test(threading.Thread):
    def __init__(self):
        self.running = True
        threading.Thread.__init__(self)

    def run(self):
        self.foo()

    def foo(self):
        print "Spam, Spam, Spam"
        self.task = threading.Timer(0.5, self.foo)
        self.task.start()

    def stop(self):
        self.running = False
        self.task.cancel()

if __name__ == "__main__":
    a = tread_test()
    print "Starting now"
    a.start()
    raw_input()
    a.stop()
```
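The runaway comes from a race: `foo()` can arm a fresh Timer after `stop()` has cancelled the previous one, so the chain keeps re-scheduling itself. A sketch of one common fix (ported to Python 3, names kept close to the original) where `foo()` checks the `running` flag before re-arming:

```python
import threading
import time

class TreadTest(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.running = True
        self.count = 0  # number of messages printed, for inspection

    def run(self):
        self.foo()

    def foo(self):
        if not self.running:  # refuse to re-arm after stop()
            return
        print("Spam, Spam, Spam")
        self.count += 1
        self.task = threading.Timer(0.05, self.foo)
        self.task.start()

    def stop(self):
        self.running = False
        if hasattr(self, "task"):
            self.task.cancel()

a = TreadTest()
a.start()
time.sleep(0.18)  # stands in for the blocking raw_input() call
a.stop()
```

Cancelling the pending Timer handles the scheduled call, and the flag check handles a `foo()` that was already executing when `stop()` ran, so neither path can restart the chain.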

Why can't I create a COM object in a new thread in Python?

瘦欲@ submitted on 2019-12-06 04:56:12

Question: I'm trying to create a COM object from a DLL in a new thread in Python, so I can run a message pump in that thread:

```python
from comtypes.client import CreateObject
import threading

class MessageThread(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.daemon = True

    def run(self):
        print "Thread starting"
        connection = CreateObject("IDMessaging.IDMMFileConnection")
        print "connection created"
        a = CreateObject("IDMessaging.IDMMFileConnection")
        print "aConnection created"

t =
```

Why can't I use operator.itemgetter in a multiprocessing.Pool?

核能气质少年 submitted on 2019-12-06 00:09:33

Question: The following program:

```python
import multiprocessing, operator

f = operator.itemgetter(0)

# def f(*a):
#     return operator.itemgetter(0)(*a)

if __name__ == '__main__':
    multiprocessing.Pool(1).map(f, ["ab"])
```

fails with the following error:

```
Process PoolWorker-1:
Traceback (most recent call last):
  File "/usr/lib/python3.2/multiprocessing/process.py", line 267, in _bootstrap
    self.run()
  File "/usr/lib/python3.2/multiprocessing/process.py", line 116, in run
    self._target(*self._args, **self._kwargs)
  File "/usr
```

What's the point of multithreading in Python if the GIL exists?

假装没事ソ submitted on 2019-12-05 23:50:18

Question: From what I understand, the GIL makes it impossible to have threads that each harness a core individually. This is a basic question, but what, then, is the point of the threading library? It seems useless if threaded code runs at the same speed as a normal program.

Answer 1: In some cases an application may not fully utilize even one core, and using threads (or processes) may help to do that. Think of a typical web application. It receives requests from clients, does some queries to the database
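The answer's point can be demonstrated directly: for I/O-bound work the GIL is released while a thread waits, so waits overlap. A sketch in which five 0.2-second waits (standing in for socket or database calls) finish in roughly 0.2 seconds total rather than 1 second:

```python
import threading
import time

def io_task():
    # time.sleep releases the GIL, just like a blocking socket read,
    # so all five threads wait concurrently.
    time.sleep(0.2)

start = time.perf_counter()
threads = [threading.Thread(target=io_task) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start
print(f"5 overlapping waits took {elapsed:.2f}s")
```

Replace the sleep with a pure-Python computation and the speedup disappears, because only one thread can hold the GIL while executing bytecode.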

Defining a writable method in an asyncore client makes sending data very slow

血红的双手。 submitted on 2019-12-05 23:08:09

I wrote an asynchronous client using Python's asyncore and ran into some problems. I solved one of them with the help of this: Asyncore client in thread makes the whole program crash when sending data immediately. But now I have another problem. My client program:

```python
import asyncore, threading, socket

class Client(threading.Thread, asyncore.dispatcher):
    def __init__(self, host, port):
        threading.Thread.__init__(self)
        self.daemon = True
        self._thread_sockets = dict()
        asyncore.dispatcher.__init__(self, map=self._thread_sockets)
        self.host = host
        self.port = port
        self.output_buffer = []
        self.start()

    def send
```

Why does threading increase processing time?

谁说胖子不能爱 submitted on 2019-12-05 22:00:31

I was working on multitasking a basic 2-D DLA simulation. Diffusion-limited aggregation (DLA) is when you have particles performing a random walk that aggregate when they touch the current aggregate. In the simulation, I have 10,000 particles walking in a random direction at each step. I use a pool of workers and a queue to feed them. I feed them a list of particles, and each worker performs the method .updatePositionAndAggregate() on each particle. If I have one worker, I feed it a list of 10,000 particles; if I have two workers, I feed each a list of 5,000 particles; if I have