Python - Using cPickle to load a previously saved pickle uses too much memory?

My pickle file is about 340 MB, but when loaded it takes up 29% of 6 GB of memory (roughly 1.7 GB). That seems like too much. The pickle file is a dictionary of dictionaries. Is this expected?
Code used:

import cPickle as pickle  # Python 2; on Python 3 this is just `import pickle`

# file_handle is an already-open binary file object for the pickle file
data = pickle.load(file_handle)

Thanks
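
For context on why a 340 MB pickle can turn into roughly 1.7 GB of live objects: every key and value in a dictionary of dictionaries is a separate Python object with its own header, and each dict keeps a sparse hash table, so the in-memory footprint is usually several times the serialized size. Below is a minimal sketch, not taken from the question, that assumes Python 2 (to match the cPickle import) and invents a small nested dict and a file name purely for illustration; it compares the on-disk pickle size with a rough sys.getsizeof-based estimate of the in-memory size:

import os
import sys
import cPickle as pickle

# Hypothetical nested data standing in for the real dictionary of dictionaries.
data = {i: {j: float(j) for j in range(100)} for i in range(10000)}

# Serialize it and note the on-disk size.
with open('demo.pkl', 'wb') as f:
    pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)
disk_size = os.path.getsize('demo.pkl')

# Rough in-memory estimate: the outer dict, every inner dict, and every key
# and value. getsizeof ignores object sharing (e.g. CPython's small-int
# cache), so this is only an approximation, but it makes the per-object
# overhead visible.
mem_size = sys.getsizeof(data)
for outer_key, inner in data.iteritems():
    mem_size += sys.getsizeof(outer_key) + sys.getsizeof(inner)
    for k, v in inner.iteritems():
        mem_size += sys.getsizeof(k) + sys.getsizeof(v)

print('on disk: %.1f MB, in memory (approx): %.1f MB' % (
    disk_size / 1e6, mem_size / 1e6))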

2 Answers

昔日梦未散 2024-09-17 16:26:24

I always had memory problems with big pickles and nested sub-dicts, so I ended up writing my objects to files via pprint and later importing those files via a custom module loader to get the data back into the process scope. It works fine and doesn't waste memory.
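
A rough sketch of what this answer seems to describe; the file name, function names, and the top-level variable name data are all made up for illustration. The object is written out as Python source with pprint, and read back by loading that generated file as a module:

import imp      # Python 2 module loading; use importlib on Python 3
import pprint

def dump_as_module(obj, path):
    # Write the object as executable Python source, e.g. "data = {...}".
    # This only works when repr() of every element is a valid Python literal.
    with open(path, 'w') as f:
        f.write('data = ' + pprint.pformat(obj) + '\n')

def load_from_module(path):
    # The "custom module loader": execute the generated file as a module
    # and pull the single top-level name back out.
    mod = imp.load_source('saved_data', path)
    return mod.data

dump_as_module({'a': {'x': 1}, 'b': {'y': 2}}, 'saved_data.py')
restored = load_from_module('saved_data.py')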

栩栩如生 2024-09-17 16:26:24

About 1.7GB seems a bit much, but not impossible. How much memory did the data take before it was pickled?

After unpickling, the data should take about the same amount of memory as it did before it was pickled; how big it is in its on-disk format is not really significant.
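
One way to check this, assuming a Unix-like system (the file path here is illustrative): compare the process's peak resident memory before and after unpickling in a fresh interpreter.

import resource
import cPickle as pickle

def peak_rss_mb():
    # ru_maxrss is reported in kilobytes on Linux (bytes on macOS).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

before = peak_rss_mb()
with open('data.pkl', 'rb') as f:
    data = pickle.load(f)
after = peak_rss_mb()

print('unpickling grew peak memory by roughly %.0f MB' % (after - before))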
