Multithreaded pickling in Python
I have a python program with multiple threads. Each thread detects events, which I would like to store somewhere so that I can read them in again (for testing). Right now, I'm using Pickle to output the events, and each thread outputs to a different file. Ideally, I would only use one output file, and all the threads would write to it, but when I try this, it looks like the various threads try to write their output at the same time, and they don't get pickled properly. Is there a way to do this?
Comments (5)
Yes, use a threading.Lock() object.
Create the lock before you create any of the threads, hand it to the method responsible for saving/pickling items, and have that method acquire the lock before writing to the file and release it afterwards.
Here is an example using threading.Lock():
import threading
import pickle

pickle_lock = threading.Lock()  # create the lock before starting any threads

def do(s):
    pickle_lock.acquire()  # only one thread may pickle at a time
    try:
        ps = pickle.dumps(s)
    finally:
        pickle_lock.release()  # always release, even if dumps() raises
    return ps

t1 = threading.Thread(target=do, args=("foo",))
t2 = threading.Thread(target=do, args=("bar",))
t1.start()  # note: Thread.start() returns None, so don't assign its result
t2.start()

inpt = input('type anything and click enter... ')
Seems like a good place to use a Queue. From the Queue docs: