realloc returns NULL after a while when allocating small (<500 KB) blocks; there is enough free memory

Posted 2024-08-30 07:04:22


Hi!

The short question is: what can be the problem?

The overall memory usage of my program (as shown by Task Manager) stays almost the same for the whole time it runs (nearly 40 minutes), and I still have nearly 2 GB of free memory.

Running on win2003r2.

Memory allocation/freeing happens frequently - I interact with other software, preparing data for it and deleting that data when it becomes outdated. The number of data blocks is not constant.

Thank you!


Comments (2)

森末i 2024-09-06 07:04:22


Typically there are only 2 reasons realloc will fail:

  1. Not enough contiguous memory to satisfy the request
  2. Memory corruption

Even though there is enough overall memory in your program to satisfy the request, there may not be enough contiguous memory to do so due to fragmentation. The best way to determine this is to use a tool that can report on contiguous blocks, so you can tell whether one is available to satisfy your request. I believe one of the tools in the Sysinternals package does so.

清风挽心 2024-09-06 07:04:22


With no code to look at, all I can give you is a workaround.

Try reallocating memory only when you need it to grow, and double it in size instead of adding only however many bytes you need. This helps tremendously with fragmentation. Since you said you have enough memory, don't worry about freeing it when you're done; just keep it around if that's reasonable.

Make it your goal to reduce fragmentation at any cost. Keeping a 200 MB working set seems perfectly fine to me for today's computing power. If you often go past 500 MB and your program runs for long periods of time, you can start looking into optimizing the working set further, but until then don't worry about it.
