python-multithreading

Why is my output from a subprocess delayed, when it is generated from a Python thread?

戏子无情 submitted on 2019-12-04 19:06:45
This is an extension of my posting from yesterday, which is still unsolved: Why does my Python thread with subprocess not work as expected? In the meantime I have found some interesting details, so I decided to create a new posting. To get to the point: there are some issues when a subprocess is spawned from a thread. Platform: Windows 7 Enterprise, Python 3.6.1. In the following code I want to run a C executable and capture its stdout output into a string. For test purposes the executable accepts two parameters: a delay and a filename (not used here). The program writes "Sleep now" to stdout,
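The usual culprit for "delayed" subprocess output is buffering: when its stdout is a pipe rather than a terminal, the child typically switches to block buffering, so lines arrive in bursts. A minimal sketch of reading a child line by line from a worker thread, with a Python one-liner standing in for the C executable (an assumption for illustration):

```python
import subprocess
import sys
import threading

def read_output(cmd, lines):
    """Read a child's stdout line by line from a worker thread."""
    # bufsize=1 gives the parent a line-buffered pipe, but the child may
    # still block-buffer its own stdout when it is piped rather than
    # attached to a terminal -- the common cause of "delayed" output.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True, bufsize=1)
    for line in proc.stdout:
        lines.append(line.rstrip())
    proc.wait()

lines = []
# A Python one-liner stands in for the C executable; -u forces the child
# to write unbuffered, so each line arrives as soon as it is printed.
child = [sys.executable, "-u", "-c", "print('Sleep now')"]
t = threading.Thread(target=read_output, args=(child, lines))
t.start()
t.join()
print(lines)
```

For a C child you cannot recompile, the fix usually has to happen on the child's side (e.g. a flush after each write), since the parent cannot change the child's buffering.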

Threading and information passing — how to

∥☆過路亽.° submitted on 2019-12-04 08:05:27
To clear up the confusion I have edited the question: one.py import threading count = 5 dev = threading.Thread(name='dev', target=dev, args=(workQueue, count,)) dev.setDaemon(True) dev.start() workQueue = Queue.Queue(10) queueLock.acquire() workQueue.put(word) queueLock.release() count = 3 time.sleep(2) count = 5 My confusion is this: I am able to put and get values from the queue between threads, but the changes to count are not reflected. Why is that? What point am I actually missing here? class dev ( threading.Thread ): def test(self): while 1: print count print self.EPP_Obj queueLock.acquire()
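The difference comes down to what the thread actually holds: the Queue is a shared mutable object, so items the caller puts in later are visible, whereas the integer was bound once when the thread was created, and rebinding the caller's `count` only rebinds a name in the caller. A minimal sketch of the distinction (names chosen for illustration):

```python
import queue
import threading

seen = []

def worker(q, count):
    # `count` was bound when the thread was created; `count = 3` in the
    # caller later rebinds a name there and never reaches this local.
    seen.append(("count", count))
    # The Queue is a shared mutable object: items put by the caller
    # after start() are visible here.
    seen.append(("queued", q.get()))

work_queue = queue.Queue(10)
count = 5
t = threading.Thread(name="dev", target=worker, args=(work_queue, count), daemon=True)
t.start()
count = 3                # invisible to the thread
work_queue.put("word")   # visible to the thread: same object
t.join()
print(seen)
```

To share a counter the same way, pass a mutable container (a list, a `queue.Queue`, or an object with a lock-protected attribute) instead of a bare int.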

Parallelize this nested for loop in python

狂风中的少年 submitted on 2019-12-04 07:56:14
I'm struggling again to improve the execution time of this piece of code. Since the calculations are really time-consuming, I think the best solution is to parallelize the code. I was first working with maps, as explained in this question, but then I tried a simpler approach, thinking that I could find a better solution. However, I haven't come up with anything yet, so since it's a different problem I decided to post it as a new question. I am working on a Windows platform, using Python 3.4. Here's the code: similarity_matrix = [[0 for x in range(word_count)] for x in range(word
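The actual similarity function is not shown in the excerpt, so the sketch below assumes a placeholder metric; the pattern is what matters: a symmetric matrix only needs its upper triangle computed, and those independent pairs can be fanned out over a `multiprocessing.Pool`:

```python
import itertools
import multiprocessing

def similarity(pair):
    # Placeholder metric (an assumption): the real code computes an
    # expensive similarity between two words.
    a, b = pair
    return len(set(a) & set(b))

def build_matrix(words):
    n = len(words)
    matrix = [[0] * n for _ in range(n)]
    # Only the upper triangle needs computing; mirror it afterwards.
    pairs = list(itertools.combinations(range(n), 2))
    with multiprocessing.Pool() as pool:
        scores = pool.map(similarity, [(words[i], words[j]) for i, j in pairs])
    for (i, j), score in zip(pairs, scores):
        matrix[i][j] = matrix[j][i] = score
    return matrix

if __name__ == "__main__":
    # The guard is required on Windows, where worker processes are
    # spawned by re-importing this module.
    print(build_matrix(["abc", "bcd", "cde"]))
```

Processes (not threads) are the right tool here because the work is CPU-bound and the GIL prevents Python threads from running bytecode in parallel.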

Python multi-threading

二次信任 submitted on 2019-12-04 06:23:58
Question: To display, say, two plotting figures (with matplotlib) and one message box (with wxPython) at the same time (in no sequential manner), a good idea is to use threads: from reportlab.pdfgen import canvas from reportlab.lib.units import mm from reportlab.platypus import Flowable from mpl_toolkits.axes_grid1 import host_subplot from threading import Thread import numpy as np import matplotlib.pyplot as plt import wx def plFig1(): fig1 = plt.figure(1) fig1.canvas.set_window_title('Quality
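A caution on this approach: GUI toolkits, including matplotlib's interactive backends and wxPython, generally require all GUI calls to happen on the main thread. A commonly safer pattern is the inverse: compute in threads, plot on the main thread. A sketch of that pattern (using the non-interactive Agg backend so it runs headless; the figure titles are placeholders):

```python
import queue
import threading

import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the sketch runs headless
import matplotlib.pyplot as plt
import numpy as np

results = queue.Queue()

def compute(title):
    # Heavy computation belongs in the worker thread; GUI calls do not.
    x = np.linspace(0, 2 * np.pi, 200)
    results.put((title, x, np.sin(x)))

threads = [threading.Thread(target=compute, args=(t,))
           for t in ("Quality 1", "Quality 2")]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All matplotlib calls stay on the main thread.
figures = []
while not results.empty():
    title, x, y = results.get()
    fig = plt.figure()
    fig.suptitle(title)
    fig.gca().plot(x, y)
    figures.append(fig)
print(len(figures))
```

With an interactive backend, `plt.show(block=False)` (or `plt.pause`) can keep several figures responsive from the main thread without any worker threads at all.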

is there any pool for ThreadingMixIn and ForkingMixIn for SocketServer?

陌路散爱 submitted on 2019-12-04 06:01:18
I was trying to build an HTTP proxy using BaseHttpServer, which is based on SocketServer, which has two asynchronous mix-ins (ThreadingMixIn and ForkingMixIn). The problem with these two is that they work per request (they allocate a new thread or fork a new subprocess for each request). Is there a mix-in that uses a pool of, say, 4 subprocesses with 40 threads each, so requests get handled by those already-created threads? That would be a big performance gain, and I guess it would save some resources. You could use a pool from concurrent.futures (in the stdlib since Python 3.2): from
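Continuing in the direction the truncated answer points: a pooled mix-in can be built by overriding `process_request` to submit work to a `concurrent.futures` executor instead of starting a fresh thread. The class names below are illustrative, not part of the stdlib:

```python
import socket
import socketserver
import threading
from concurrent.futures import ThreadPoolExecutor

class PoolMixIn(socketserver.ThreadingMixIn):
    """Handle each request on a fixed-size pool instead of a new thread."""
    def process_request(self, request, client_address):
        # Reuse ThreadingMixIn's per-request logic, but run it on the pool.
        self.pool.submit(self.process_request_thread, request, client_address)

class PooledTCPServer(PoolMixIn, socketserver.TCPServer):
    allow_reuse_address = True
    pool = ThreadPoolExecutor(max_workers=40)  # cap concurrency here

class EchoHandler(socketserver.StreamRequestHandler):
    def handle(self):
        self.wfile.write(self.rfile.readline())

server = PooledTCPServer(("127.0.0.1", 0), EchoHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with socket.create_connection(server.server_address) as conn:
    conn.sendall(b"ping\n")
    reply = conn.makefile("rb").readline()
print(reply)
server.shutdown()
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` does not work directly here (sockets do not pickle), so a process pool needs a different hand-off strategy.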

Does pydispatcher run the handler function in a background thread?

空扰寡人 submitted on 2019-12-04 06:00:42
Question: While looking up event-handler modules, I came across pydispatcher, which seemed beginner-friendly. My use case for the library is that I want to send a signal if my queue size is over a threshold. The handler function can then start processing and removing items from the queue (and subsequently do a bulk insert into the database). I would like the handler function to run in the background. I am aware that I can simply override the queue.append() method, checking for the queue size and calling
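As far as I know, pydispatcher invokes connected handlers synchronously, in the thread that calls `dispatcher.send`; it does not spawn background threads for you. The handler can get background behavior by handing the work to an executor and returning immediately. A stdlib-only sketch of that pattern (a plain callback stands in for `dispatcher.send`, so the sketch runs without pydispatcher installed; all names are illustrative):

```python
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

THRESHOLD = 3
work = queue.Queue()
executor = ThreadPoolExecutor(max_workers=1)
collected = []
done = threading.Event()

def on_threshold():
    # The handler returns immediately; the drain runs on the executor's
    # thread, so the producer is never blocked by the processing.
    executor.submit(drain)

def drain():
    while not work.empty():
        collected.append(work.get())
    # ... bulk insert `collected` into the database here ...
    done.set()

def append(item):
    # Stand-in for the overridden queue.append(): check the size and
    # fire the signal (dispatcher.send in the real code).
    work.put(item)
    if work.qsize() >= THRESHOLD:
        on_threshold()

for i in range(THRESHOLD):
    append(i)
done.wait(timeout=5)
print(collected)
```

With real pydispatcher, `on_threshold` would be the function passed to `dispatcher.connect`, and `append` would call `dispatcher.send` instead of invoking it directly.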

Why can't I use operator.itemgetter in a multiprocessing.Pool?

末鹿安然 submitted on 2019-12-04 05:12:12
The following program: import multiprocessing,operator f = operator.itemgetter(0) # def f(*a): return operator.itemgetter(0)(*a) if __name__ == '__main__': multiprocessing.Pool(1).map(f, ["ab"]) fails with the following error: Process PoolWorker-1: Traceback (most recent call last): File "/usr/lib/python3.2/multiprocessing/process.py", line 267, in _bootstrap self.run() File "/usr/lib/python3.2/multiprocessing/process.py", line 116, in run self._target(*self._args, **self._kwargs) File "/usr/lib/python3.2/multiprocessing/pool.py", line 102, in worker task = get() File "/usr/lib/python3.2

asyncio: Wait for event from other thread

陌路散爱 submitted on 2019-12-04 03:54:27
I'm designing an application in Python which needs to access a machine to perform some (lengthy) tasks. The asyncio module seems to be a good choice for everything network-related, but now I need to access the serial port for one specific component. I've implemented a kind of abstraction layer for the actual serial-port handling, but can't figure out how to sensibly integrate this with asyncio. The setup is as follows: I have a thread running a loop, which regularly talks to the machine and decodes the responses. Using a method enqueue_query(), I can put a query string into a queue, which will
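The standard bridge for this setup is `loop.call_soon_threadsafe`: asyncio primitives such as `asyncio.Event` are not thread-safe, so the serial thread must hand the `set()` call over to the event loop's own thread. A minimal sketch, with a sleeping worker standing in for the serial-port loop (names are illustrative):

```python
import asyncio
import threading
import time

def serial_worker(loop, event, responses):
    # Runs on a plain thread, standing in for the serial-port loop.
    time.sleep(0.1)                       # the "lengthy" machine interaction
    responses.append("OK")
    # asyncio primitives are not thread-safe; hand the set() over to
    # the event loop's own thread.
    loop.call_soon_threadsafe(event.set)

async def main():
    loop = asyncio.get_running_loop()
    event = asyncio.Event()
    responses = []
    threading.Thread(target=serial_worker,
                     args=(loop, event, responses), daemon=True).start()
    await event.wait()                    # suspends this task, not the loop
    return responses[0]

answer = asyncio.run(main())
print(answer)
```

For request/response pairs, `asyncio.run_coroutine_threadsafe` (thread to loop) or a per-query `loop.create_future()` resolved via `call_soon_threadsafe` generalize the same idea.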

Using Multithreaded queue in python the correct way?

落花浮王杯 submitted on 2019-12-04 00:49:35
I am trying to use a multithreaded Queue in Python. I just want to know whether the approach I am using is correct, whether I am doing something redundant, or whether there is a better approach I should use. I am trying to fetch new requests from a table and schedule them, using some logic, to perform an operation like running a query. Here, from the main thread, I spawn a separate thread for the queue. if __name__=='__main__': request_queue = SetQueue(maxsize=-1) worker = Thread(target=request_queue.process_queue) worker.setDaemon(True) worker.start() while True: try: #Connect
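The custom SetQueue class is not shown in the excerpt, so the sketch below uses a plain `queue.Queue` to show the conventional producer/consumer shape: the worker blocks on `get()`, marks items with `task_done()`, and exits on a sentinel, while the producer can use `join()` to wait for the backlog to drain:

```python
import queue
import threading

def process_queue(q, handled):
    # Consumer loop: block on get(), process, mark done. A None
    # sentinel tells the worker to exit cleanly.
    while True:
        request = q.get()
        if request is None:
            q.task_done()
            break
        handled.append(request)   # run the scheduled query here
        q.task_done()

request_queue = queue.Queue()     # stands in for the custom SetQueue
handled = []
worker = threading.Thread(target=process_queue,
                          args=(request_queue, handled), daemon=True)
worker.start()

for request in ["query-1", "query-2"]:
    request_queue.put(request)
request_queue.put(None)
request_queue.join()              # block until every item is processed
print(handled)
```

Making the processing loop a method of the queue subclass (as in the question) works too, but keeping the consumer a free function makes the queue reusable and the shutdown path (the sentinel) explicit.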

threading.Event().wait() is not notified after calling event.set()

我是研究僧i submitted on 2019-12-03 22:28:23
I came across a very strange issue when using threading.Event() and couldn't understand what was going on. I must have missed something; can you please point it out? I have a Listener class which shares the same event object with a signal handler. Here is my simplified code: import threading, time class Listener(object): def __init__(self, event): super(Listener, self).__init__() self.event = event def start(self): while not self.event.is_set(): print("Listener started, waiting for messages ...") self.event.wait() print("Listener is terminated ...") self.event.clear() event = threading.Event()
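When the setter is a signal handler, the usual workaround is to wait with a timeout in a loop: on some platforms and Python versions, a bare `event.wait()` parks the main thread in a lock acquire during which the signal handler never gets a chance to run, so the `set()` appears lost. A sketch of the pattern, with a `threading.Timer` standing in for the signal handler:

```python
import threading

class Listener:
    def __init__(self, event):
        self.event = event

    def start(self):
        while not self.event.is_set():
            # Waiting with a timeout makes the main thread wake up
            # periodically, giving signal handlers a chance to run; on
            # some platforms a bare wait() blocks uninterruptibly.
            self.event.wait(timeout=0.1)
        print("Listener is terminated ...")
        self.event.clear()

event = threading.Event()
listener = Listener(event)
# A Timer thread stands in for the signal handler that calls event.set().
threading.Timer(0.2, event.set).start()
listener.start()
print(event.is_set())
```

The loop-with-timeout also makes the wait robust against the related `clear()`-before-`wait()` race, since `is_set()` is re-checked on every iteration.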