Question:
I have a python program with multiple threads. Each thread detects events, which I would like to store somewhere so that I can read them in again (for testing). Right now, I'm using Pickle to output the events, and each thread outputs to a different file. Ideally, I would only use one output file, and all the threads would write to it, but when I try this, it looks like the various threads try to write their output at the same time, and they don't get pickled properly. Is there a way to do this?
Answer 1:
This seems like a good place to use a Queue:
- Have all your event detection threads put items on a shared Queue.
- Create another thread to get items from the queue, and write/pickle/whatever from this thread.
from the Queue docs:
"The Queue module implements multi-producer, multi-consumer queues. It is especially useful in threaded programming when information must be exchanged safely between multiple threads. The Queue class in this module implements all the required locking semantics. It depends on the availability of thread support in Python; see the threading module."
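A minimal sketch of this producer/consumer setup (file name and event values are made up for illustration): detection threads put events on a shared queue, and a single writer thread is the only one that touches the file, so the pickles never interleave.

```python
import pickle
import queue
import threading

event_queue = queue.Queue()
SENTINEL = None  # signals the writer thread to stop

def detector(events):
    # stand-in for an event-detection thread: push events onto the shared queue
    for ev in events:
        event_queue.put(ev)

def writer(path):
    # single consumer: the only thread that writes to the file
    with open(path, "wb") as f:
        while True:
            ev = event_queue.get()
            if ev is SENTINEL:
                break
            pickle.dump(ev, f)

producers = [threading.Thread(target=detector, args=(["a", "b"],)),
             threading.Thread(target=detector, args=(["c"],))]
consumer = threading.Thread(target=writer, args=("events.pkl",))

consumer.start()
for p in producers:
    p.start()
for p in producers:
    p.join()
event_queue.put(SENTINEL)  # all producers are done; tell the writer to finish
consumer.join()
```

To read the events back for testing, call pickle.load() on the file repeatedly until EOFError.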
Answer 2:
Yes, with threading.Lock() objects. You create a lock before creating all your threads, you give it to the method that is responsible for saving/pickling items, and this method should acquire the lock before writing to the file and release it afterwards.
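A sketch of that approach, assuming all threads share one open file object (the file name and event values here are illustrative): the lock guarantees that each pickle.dump() completes before another thread starts writing.

```python
import pickle
import threading

write_lock = threading.Lock()
out = open("events.pkl", "wb")  # one file shared by every thread

def save_event(event):
    # only one thread may pickle to the shared file at a time
    with write_lock:
        pickle.dump(event, out)

threads = [threading.Thread(target=save_event, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
out.close()
```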
Answer 3:
You could create a lock and acquire/release it around every call to pickle.dump().
Answer 4:
The logging module has an RLock built into its handlers, so you can log as normal (just create a handler that logs to a file).
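A minimal sketch of that idea (logger name and file name are made up): each handler's internal lock serializes the writes, so lines from different threads never interleave. Note that logging writes text, so pickled binary data would still need encoding (or one of the lock/queue approaches above).

```python
import logging
import threading

# one file handler shared by all threads; its internal RLock serializes writes
logger = logging.getLogger("events")
logger.setLevel(logging.INFO)
handler = logging.FileHandler("events.log")
handler.setFormatter(logging.Formatter("%(message)s"))
logger.addHandler(handler)

def detect(name):
    # the handler's lock makes this safe to call from many threads
    logger.info("event from %s", name)

threads = [threading.Thread(target=detect, args=("t%d" % i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
logging.shutdown()  # flush and close the handler
```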
Answer 5:
Here is an example using threading.Lock():
import threading
import pickle

pickle_lock = threading.Lock()
results = {}  # a thread's target cannot return a value, so collect output here

def do(s):
    with pickle_lock:
        ps = pickle.dumps(s)
    results[s] = ps

t1 = threading.Thread(target=do, args=("foo",))
t2 = threading.Thread(target=do, args=("bar",))
t1.start()
t2.start()
t1.join()
t2.join()
Source: https://stackoverflow.com/questions/7514958/pickling-from-multiple-threads-in-python