Problem with cloning and processing a large structure

Posted 2024-08-28 03:29:49


My Perl script has weird behaviour that I don't understand. I'm processing a large structure, stored as an array of hashes, which grows while it is being processed. The problem is that the structure takes at most about 8 MB when I store it on disk, but while it is being processed it occupies about 130 MB of RAM. Why is the difference so large?
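For reference, one way to quantify the gap is to compare the frozen size of one element with its in-memory size. A minimal sketch, assuming the CPAN module Devel::Size is installed; the sample data is made up:

use strict;
use warnings;
use Storable    qw(freeze);
use Devel::Size qw(total_size);   # CPAN module, assumed available

# made-up stand-in for one element of the real structure
my %some_el = (name => 'x' x 100, values => [1 .. 1000]);

printf "serialized: %d bytes\n", length(freeze(\%some_el));
printf "in memory:  %d bytes\n", total_size(\%some_el);

On a typical build the in-memory figure comes out several times larger than the frozen one, because every scalar, array slot, and hash bucket carries its own bookkeeping.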

The main processing flow looks like this:

use Storable;   # provides Storable::dclone

while (...)
{
    # deep-clone the element, then copy the clone into a fresh hash
    my %new_el = %{ Storable::dclone \%some_el };

    # ...
    # change a few things in %new_el
    # ...

    push @$elements_ref, \%new_el;
}


Comments (1)

〃安静 2024-09-04 03:29:49


You are making more copies of the data than you need to: %{ Storable::dclone \%some_el } deep-clones the hash and then copies the entire clone a second time into %new_el. Try working with hashrefs rather than dereferencing, as much as possible:

while (...)
{
    # dclone already returns a new hashref; keeping it as a ref
    # avoids copying the clone a second time
    my $new_el = Storable::dclone \%some_el;

    # ...
    # change a few things in $new_el
    # ...

    push @$elements_ref, $new_el;
}

Even better would be to not clone the entire hash -- perhaps you can get away with altering it in-place?
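For example, if each stored element differs from the template in only a few top-level fields, a shallow copy plus targeted overwrites skips deep-cloning the parts that never change. A sketch with made-up field names; note that nested references are shared by the shallow copy, so this is only safe if those substructures are treated as read-only:

use strict;
use warnings;

my %some_el      = (status => 'new', count => 0, payload => { big => 'data' });
my $elements_ref = [];

for my $i (1 .. 3) {
    # shallow copy: top-level keys are duplicated, nested refs are shared
    my $new_el = { %some_el };

    # overwrite only the fields that actually change (hypothetical names)
    $new_el->{status} = 'processed';
    $new_el->{count}  = $i;

    push @$elements_ref, $new_el;
}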
