Problem with cloning and processing a large structure
My Perl script has weird behaviour that I don't understand. I'm processing a large structure stored as an array of hashes, and the array grows during processing. The problem is that the structure takes at most about 8 MB when I store it on disk, but while it is being processed it uses about 130 MB of RAM. Why is there such a big difference?
The main processing loop looks like this:
use Storable;

while (...)
{
    # deep-clone the current element, then copy the clone into a new hash
    my %new_el = %{ Storable::dclone(\%some_el) };
    # ...
    # change a few things in new_el
    # ...
    push @$elements_ref, \%new_el;
}
Comments (1)
You are making more copies of the data than you need to: dclone already returns a reference to a fresh deep copy, and dereferencing it into %new_el copies every key and value a second time, so each element is built twice. Try working with hashrefs rather than dereferencing, as much as possible:
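A minimal sketch, keeping the loop shape from the question (the key modified inside the loop is a placeholder; only the handling of the clone changes):

use Storable qw(dclone);

while (...)
{
    # keep the deep clone as a hashref instead of flattening it into a new hash
    my $new_el = dclone(\%some_el);
    # ...
    # change a few things through the reference, e.g.:
    $new_el->{some_key} = 'new value';   # placeholder edit
    # ...
    # push the same reference; no extra copy is made
    push @$elements_ref, $new_el;
}

This pushes the reference that dclone returned, so each element is materialised once instead of twice.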
Even better would be to not clone the entire hash -- perhaps you can get away with altering it in-place?
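Whether that works depends on the data, but if consecutive elements differ only in a few top-level values, a shallow copy that shares the nested data is one alternative to a full deep clone. A sketch, assuming the nested structures are never modified per element (the tweaked key is hypothetical):

# shallow copy: top-level pairs are copied, nested references are shared
my $new_el = { %some_el };
$new_el->{count}++;                  # hypothetical top-level change
push @$elements_ref, $new_el;

Since the nested hashes and arrays are shared rather than duplicated, memory grows only by the size of one top level per element.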