python-multithreading

Check if the Main Thread is still alive from another thread

…衆ロ難τιáo~ submitted on 2019-11-30 15:43:54
How can I check from another (non-daemon, child) thread whether the main thread is still alive? The child thread is a non-daemon thread, and I'd like to check whether the main thread is still running and stop the non-daemon thread based on the result. (Making the thread a daemon is not good for my situation, because my thread writes to stdout, which creates problems when the thread is set as a daemon.) Using Python 2.7.

For Python 2.7 you can try this:

```python
for i in threading.enumerate():
    if i.name == "MainThread":
        print i.is_alive()
```

The use of lower camel case in function names is deprecated, and so you…
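In Python 3, `threading.main_thread()` returns the main-thread object directly, so no name comparison is needed. A minimal sketch of a polling child thread (the `watcher` function and its `demo_timeout` are illustrative; the timeout only bounds this example so it always terminates):

```python
import threading
import time

def watcher(poll_interval=0.1, demo_timeout=2.0):
    # Poll the main thread from a non-daemon child thread; stop when it exits.
    # (demo_timeout only bounds this example so it always terminates.)
    main = threading.main_thread()
    deadline = time.time() + demo_timeout
    while main.is_alive() and time.time() < deadline:
        time.sleep(poll_interval)

t = threading.Thread(target=watcher)
t.start()
print(threading.main_thread().is_alive())  # True: the main thread is running this line
```

In CPython, the main thread is marked stopped at interpreter shutdown before the remaining non-daemon threads are joined, so a loop like this can observe `is_alive()` becoming False and exit cleanly.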

python logging performance comparison and options

北慕城南 submitted on 2019-11-30 12:50:15
I am researching high-performance logging in Python and so far have been disappointed by the performance of the standard logging module, but there seem to be no alternatives. Below is a piece of code that performance-tests four different ways of logging:

```python
import logging
import timeit
import time
import datetime
from logutils.queue import QueueListener, QueueHandler
import Queue
import threading

tmpq = Queue.Queue()

def std_manual_threading():
    start = datetime.datetime.now()
    logger = logging…
```
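In Python 3 the queue-based pattern the snippet imports from logutils lives in the standard library as `logging.handlers.QueueHandler` / `QueueListener`. A minimal sketch of the idea, keeping the hot path cheap by deferring I/O to a listener thread (the logger and handler names are illustrative):

```python
import logging
import logging.handlers
import queue

log_queue = queue.Queue(-1)  # unbounded

# Application threads only enqueue records, which is fast...
log = logging.getLogger("app")
log.setLevel(logging.INFO)
log.addHandler(logging.handlers.QueueHandler(log_queue))

# ...while a single background thread performs the slow I/O.
listener = logging.handlers.QueueListener(log_queue, logging.StreamHandler())
listener.start()

log.info("hello from the fast path")
listener.stop()  # processes any queued records before returning
```

Timing the logger calls with and without the listener attached (e.g. with timeit) gives a rough picture of the enqueue cost versus direct FileHandler writes.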

Is threading.local() a safe way to store variables for a single request in Google AppEngine?

本秂侑毒 submitted on 2019-11-30 09:07:42
I have a Google App Engine app where I want to set a global variable for that request only. Can I do this?

In request_vars.py:

```python
# request_vars.py
import threading

global_vars = threading.local()
```

In another.py:

```python
# another.py
from request_vars import global_vars

def get_time():
    return global_vars.time_start
```

In main.py:

```python
# main.py
import datetime

import another
from request_vars import global_vars

global_vars.time_start = datetime.datetime.now()
time_start = another.get_time()
```

Questions: considering multithreading, concurrent requests, building on Google App Engine, and hundreds (even thousands) of requests per second, will the value of…
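A `threading.local()` attribute is private to the thread that set it, which is what makes the pattern work when each request runs on its own thread. A small self-contained check (`handle_request`, the sleep, and the `results` dict are illustrative):

```python
import threading
import time

local_data = threading.local()
results = {}

def handle_request(name, value):
    local_data.value = value
    time.sleep(0.01)  # give the other threads a chance to set their own value
    results[name] = local_data.value  # still this thread's value, not another's

threads = [threading.Thread(target=handle_request, args=("t%d" % i, i)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# results maps each thread name to the value that thread set, e.g. "t3" -> 3
```

The caveat for request-scoped use is thread reuse: if the runtime serves a later request on the same thread, a stale value from the previous request is still there unless it is reset at the start of each request.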

Are there any built-in functions which block on I/O that don't allow other threads to run?

混江龙づ霸主 submitted on 2019-11-30 06:56:24
I came across this interesting statement in the "Caveats" section of the documentation for the thread module today: "Not all built-in functions that may block waiting for I/O allow other threads to run. (The most popular ones (time.sleep(), file.read(), select.select()) work as expected.)" Pretty much everywhere else I've seen Python threads discussed, there has always been an assumption that all built-in functions which do I/O release the GIL, meaning other threads can run while the function blocks. As far as I knew, the only risk of an I/O operation blocking other threads would be if…
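A quick way to see a blocking call releasing the GIL is to overlap several `time.sleep()` calls: if the lock were held while sleeping, four 0.2-second sleeps would serialize to roughly 0.8 seconds. A minimal sketch (the durations and thread count are arbitrary):

```python
import threading
import time

def blocked_sleep():
    time.sleep(0.2)  # releases the GIL while blocking

start = time.time()
threads = [threading.Thread(target=blocked_sleep) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
# The four sleeps overlap, so elapsed is close to 0.2 s rather than 0.8 s.
```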

sharing a :memory: database between different threads in python using sqlite3 package

左心房为你撑大大i submitted on 2019-11-30 04:59:55
I would like to create a :memory: database in Python and access it from different threads. Essentially something like:

```python
class T(threading.Thread):
    def run(self):
        self.conn = sqlite3.connect(':memory:')
        # do stuff with the database

for i in xrange(N):
    T().start()
```

and have all the connections refer to the same database. I am aware of passing check_same_thread=False to the connect function and sharing one connection between threads, but would like to avoid doing that if possible. Thanks for any help. EDIT: corrected a typo. I originally said "have all the connections referring to the same thread…
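One way to get separate connections onto the same in-memory database is SQLite's shared-cache mode with a named in-memory URI, which Python's sqlite3 supports via `uri=True` (Python 3.4+). A sketch under that assumption (the database name, table, and explicit write lock are illustrative; shared-cache mode uses table-level locking, so writes are serialized here):

```python
import sqlite3
import threading

# A named in-memory database with a shared cache is visible to every
# connection in this process that opens the same URI.
DB_URI = "file:shared_mem_db?mode=memory&cache=shared"

# Keep one connection open so the database survives workers closing theirs.
keeper = sqlite3.connect(DB_URI, uri=True)
keeper.execute("CREATE TABLE kv (k TEXT, v TEXT)")
keeper.commit()

write_lock = threading.Lock()  # serialize writes; shared cache uses table-level locks

def worker(n):
    conn = sqlite3.connect(DB_URI, uri=True)
    with write_lock:
        conn.execute("INSERT INTO kv VALUES (?, ?)", (str(n), "x"))
        conn.commit()
    conn.close()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

count = keeper.execute("SELECT COUNT(*) FROM kv").fetchone()[0]
# count == 5: all five worker inserts are visible to the keeper connection
```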

Parallelism in python isn't working right

穿精又带淫゛_ submitted on 2019-11-29 19:26:16
I was developing an app on GAE using Python 2.7. An Ajax call requests some data from an API; a single request takes about 200 ms. However, when I open two browsers and make two requests at very nearly the same time, they take more than double that. I've tried putting everything in threads, but it didn't work (this happens when the app is online, not just on the dev server). So I wrote this simple test to see if this is a problem with Python in general (in the case of a busy wait). Here is the code…
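The busy-wait case can be tested locally: on CPython, a pure-Python CPU-bound loop does not speed up with threads, because the GIL allows only one thread to execute bytecode at a time. A rough sketch (the loop body and iteration count are arbitrary):

```python
import threading
import time

def busy(n):
    # Pure-bytecode loop: on CPython with the GIL, only one thread runs it at a time.
    x = 0
    for _ in range(n):
        x += 1
    return x

N = 2_000_000

start = time.time()
busy(N)
busy(N)
sequential = time.time() - start

start = time.time()
threads = [threading.Thread(target=busy, args=(N,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
threaded = time.time() - start
# With the GIL, `threaded` is roughly equal to (and often worse than) `sequential`.
```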

Python multiprocessing: TypeError: expected string or Unicode object, NoneType found

*爱你&永不变心* submitted on 2019-11-29 16:57:07
I am attempting to download a whole FTP directory in parallel.

```python
#!/usr/bin/python
import sys
import datetime
import os
from multiprocessing import Process, Pool
from ftplib import FTP

curYear = ""
remotePath = ""
localPath = ""

def downloadFiles(remotePath, localPath):
    splitted = remotePath.split('/')
    host = splitted[2]
    path = '/' + '/'.join(splitted[3:])
    ftp = FTP(host)
    ftp.login()
    ftp.cwd(path)
    filenames = ftp.nlst()
    total = len(filenames)
    i = 0
    pool = Pool()
    for filename in filenames:
        local_filename = …
```
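A "expected string or Unicode object, NoneType found" from multiprocessing usually means a worker received, or built a path from, None where a string was expected, e.g. a module-level global that is still empty in the child process. A hedged sketch of passing every needed string explicitly to the pool workers (`download_one`, the host, and the paths are hypothetical, and the actual FTP retrieval is left as a comment so the example stays self-contained):

```python
import os
from multiprocessing import Pool

def download_one(args):
    # Everything the worker needs arrives explicitly as strings, so no child
    # process ever depends on a module-level global that might still be None.
    host, path, filename, local_dir = args
    # ... connect with ftplib.FTP(host), cwd(path), and RETR `filename` here ...
    return os.path.join(local_dir, filename)

if __name__ == "__main__":
    tasks = [("ftp.example.com", "/pub", name, "/tmp") for name in ["a.txt", "b.txt"]]
    with Pool(2) as pool:
        results = pool.map(download_one, tasks)
    print(results)
```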

Python IPC with matplotlib

穿精又带淫゛_ submitted on 2019-11-29 12:53:28
Project description: connect an existing C program (the main control) to a Python GUI/widget. For this I'm using a FIFO. The C program is designed to look at frame-based telemetry. The Python GUI performs two functions:

1. Runs/creates plots (probably created through matplotlib) via a GUI widget as the user desires (individual .py files, scripts written by different users).
2. Relays the frame number to the Python plotting scripts after they have been created, so they can "update" themselves after being given the frame number from the master program.

I have several questions--understanding the pros and cons from…
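For the relay part, one frame number per line over the FIFO keeps the protocol easy to debug from both the C and Python sides. A minimal sketch of the Python reader under that assumption (`FIFO_PATH`, `parse_frame`, and `frame_listener` are illustrative; the C side is expected to open the same path for writing):

```python
import threading

FIFO_PATH = "/tmp/telemetry_fifo"  # path is an assumption; create it with mkfifo

def parse_frame(line):
    # Protocol assumption: the C side writes one decimal frame number per line.
    return int(line.strip())

def frame_listener(on_frame):
    # Opening a FIFO for reading blocks until the writer side connects.
    with open(FIFO_PATH, "r") as fifo:
        for line in fifo:
            if line.strip():
                on_frame(parse_frame(line))

# The listener would run in a background thread so the GUI/matplotlib
# event loop stays responsive:
# threading.Thread(target=frame_listener, args=(update_plots,), daemon=True).start()
```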

Understanding thread.join(timeout)

两盒软妹~` submitted on 2019-11-29 12:08:41
So the timeout param for a thread should stop the thread after timeout seconds (if it hasn't terminated yet). In my software I'm trying to replace a Queue.Queue.join() (the queue contains an item for every thread; each thread will run Queue.Queue.task_done()) that could freeze the software if a thread doesn't terminate: if one thread among fifty doesn't terminate, then everything is frozen. I want every thread to stop in 5 seconds, for example, so I will start each thread with a timeout of 5 seconds. Is this correct?

CODE

```python
import threading
import time

def tt(name, num):
    while True:
        num += 0.5
        print…
```
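Note that `Thread.join(timeout)` only bounds the wait in the calling thread; it does not stop the joined thread, which keeps running. A minimal sketch showing the distinction, with an Event for the cooperative shutdown that actually ends the thread (`worker` and the durations are illustrative):

```python
import threading
import time

stop_event = threading.Event()

def worker():
    # Runs until asked to stop; join(timeout) alone cannot kill it.
    while not stop_event.is_set():
        time.sleep(0.05)

t = threading.Thread(target=worker)
t.start()

t.join(timeout=0.2)       # returns after ~0.2 s; the thread is NOT stopped
still_running = t.is_alive()  # True

stop_event.set()          # cooperative shutdown is what actually ends the thread
t.join()
stopped = not t.is_alive()    # True
```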