Reducing memory usage in extended Mathematica sessions



I'm doing some rather long computations, which can easily span a few days. In the course of these computations, sometimes Mathematica will run out of memory. To work around this, I've ended up resorting to something along the lines of:

ParallelEvaluate[$KernelID]; (* Force the kernels to launch *)
kernels = Kernels[];

Do[
   If[Mod[iteration, n] == 0,
      CloseKernels[kernels];
      LaunchKernels[kernels];
      ClearSystemCache[]];
   (* Complicated stuff here *)
   Export[...], (* If a computation ends early I don't want to lose past results *)
   {iteration, min, max}]

This is great and all, but over time the main kernel accumulates memory. Currently, my main kernel is eating up roughly 1.4 GB of RAM. Is there any way I can force Mathematica to clear out the memory it's using? I've tried littering Share and Clear throughout the many Modules I'm using in my code, but the memory still seems to build up over time.

I've also tried to make sure I have nothing big and complicated running outside of a Module, so that nothing stays in scope too long. But even with this I still have my memory issues.

Is there anything I can do about this? I'm always going to have a large amount of memory being used, since most of my calculations involve several large and dense matrices (usually 1200 x 1200, but it can be more), so I'm wary about using MemoryConstrained.
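To see whether the growth is really in the master kernel or also in the subkernels, one thing I can do is poll MemoryInUse[] at the same checkpoints where the kernels get restarted. A rough sketch (reportMemory is just an ad-hoc helper name, and n is the same checkpoint interval as in the loop above):

(* Report the bytes currently in use by the master kernel and by each subkernel.
   MemoryInUse[] measures the kernel that evaluates it; ParallelEvaluate runs it
   on every subkernel and returns the list of results. *)
reportMemory[] := Print[
   "master: ", MemoryInUse[],
   "   subkernels: ", ParallelEvaluate[MemoryInUse[]]];

(* inside the Do loop, at the existing checkpoints: *)
(* If[Mod[iteration, n] == 0, reportMemory[]]; *)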


Update:

The problem was exactly what Alexey Popkov stated in his answer. If you use Module, memory will leak slowly over time. It happened to be exacerbated in this case because I had multiple Module[..] statements. The "main" Module was within a ParallelTable where 8 kernels were running at once. Tack on the (relatively) large number of iterations, and this was a breeding ground for lots of memory leaks due to the bug with Module.
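For concreteness, the structure that leaked was roughly of the following shape; the matrix size matches what I described above, but the names, the body, and the min/max iteration bounds are placeholders rather than the real computation:

(* Hypothetical shape of the problematic code: a Module evaluated once per
   iteration of a ParallelTable running on 8 subkernels.  With the Module-related
   leak described in the answer below, iterations like this can leave temporary
   symbols behind. *)
results = ParallelTable[
   Module[{m, tmp},
      m = RandomReal[1, {1200, 1200}];   (* stand-in for the real data *)
      tmp = m.Transpose[m];              (* stand-in for the real work *)
      Tr[tmp]],
   {iteration, min, max}];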


Answers (2)

Answer 1:


Since you are using Module extensively, I think you may be interested to know about this bug, in which temporary Module variables are not deleted.

Example (unlinked temporary variables are not deleted, and neither are their definitions):

In[1]:= $HistoryLength=0;
a[b_]:=Module[{c,d},d:=9;d/;b===1];
Length@Names[$Context<>"*"]

Out[3]= 6

In[4]:= lst=Table[a[1],{1000}];
Length@Names[$Context<>"*"]

Out[5]= 1007

In[6]:= lst=.
Length@Names[$Context<>"*"]

Out[7]= 1007

In[8]:= Definition@d$999

Out[8]= Attributes[d$999]={Temporary}

d$999:=9

Note that in the above code I set $HistoryLength = 0; to stress this buggy behavior of Module. If you do not do this, temporary variables can still be linked from the history variables (In and Out), and for that reason they will not be removed together with their definitions in a broader set of cases (which is not a bug but a feature, as Leonid mentioned).
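As a side note (this is a workaround, not a fix of the underlying bug), one can reclaim the space taken by such leaked symbols by removing them explicitly by name. The sketch below assumes the leaked variables live in the current context, have names containing "$", and carry the Temporary attribute:

(* Find leaked temporary symbols by name pattern and attribute, then Remove them.
   Remove accepts symbol names given as strings, so the list can be passed via Apply. *)
leaked = Select[Names[$Context <> "*$*"],
   MemberQ[Attributes[#], Temporary] &];
If[leaked =!= {}, Remove @@ leaked]

Of course this is only safe if nothing still references those symbols, so it is best done at a checkpoint between iterations.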

UPDATE: Just for the record, there is another old bug, present already in v5.2, in which unreferenced Module variables are not deleted after Part assignments to them; it is not completely fixed even in version 7.0.1:

In[1]:= $HistoryLength=0;$Version
Module[{L=Array[0&,10^7]},L[[#]]++&/@Range[100];];
Names["L$*"]
ByteCount@Symbol@#&/@Names["L$*"]
Out[1]= 7.0 for Microsoft Windows (32-bit) (February 18, 2009)
Out[3]= {L$111}
Out[4]= {40000084}
Answer 2:


Have you tried evaluating $HistoryLength=0; in all subkernels as well as in the master kernel? History tracking is the most common cause of running out of memory.

Have you tried not using the slow and memory-consuming Export, and using the fast and efficient Put instead?

It is not clear from your post whether you evaluate ClearSystemCache[] in the master kernel or in the subkernels; it looks like you evaluate it only in the master kernel. Try evaluating it in all subkernels too, before each iteration.
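A rough sketch of how these three suggestions could be combined with the loop from your question; result, n, min, max, and the output file name are placeholders, and the kernel-relaunch part is kept exactly as you wrote it:

$HistoryLength = 0;                               (* master kernel *)
ParallelEvaluate[$HistoryLength = 0];             (* every subkernel *)

Do[
   If[Mod[iteration, n] == 0,
      CloseKernels[kernels];
      LaunchKernels[kernels];
      ParallelEvaluate[$HistoryLength = 0];       (* re-apply on the fresh subkernels *)
      ParallelEvaluate[ClearSystemCache[]];       (* clear caches on the subkernels... *)
      ClearSystemCache[]];                        (* ...and on the master kernel *)
   (* Complicated stuff here *)
   Put[result, "result_" <> ToString[iteration] <> ".m"],   (* Put instead of Export *)
   {iteration, min, max}]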
