Memory limitations in a 64-bit .NET application?
On my laptop, running 64 bit Windows 7 and with 2 Gb of free memory (as reported by Task Manager), I'm able to do:
var x = new Dictionary<Guid, decimal>( 30 * 1024 * 1024 );
Without having a computer with more RAM at hand, I'm wondering whether this scales, so that on a computer with 4 GB of free memory I'd be able to allocate 60M items instead of "just" 30M, and so on?
Or are there other limitations (of .Net and/or Windows) that I'll bump into before I'm able to consume all available RAM?
Update: OK, so I'm not allowed to allocate a single object larger than 2 GB. That's important to know! But then I'm of course curious to know whether I'd be able to fully utilize all memory by allocating 2 GB chunks like this:
var x = new List<Dictionary<Guid, decimal>>();
for ( var i = 0 ; i < 10 ; i++ )
    x.Add( new Dictionary<Guid, decimal>( 30 * 1024 * 1024 ) );
Would this work if the computer has > 20 GB of free memory?
There's a 2 GiB limit on all objects in .NET: you are never allowed to create a single object that exceeds 2 GiB. If you need a bigger object, you need to make sure it is built from parts each smaller than 2 GiB, so you cannot have a contiguous array of bits larger than 2 GiB, or a single string larger than 512 MiB. I'm not entirely sure about the string figure, but I've done some testing on the issue and got OutOfMemoryExceptions when I tried to allocate strings bigger than 512 MiB.
These limits are also subject to heap fragmentation, and even though the GC does try to compact the heap, large objects (the cutoff is somewhat arbitrary, at 85,000 bytes) end up on the large object heap, which is a heap that isn't compacted. Strictly speaking, and somewhat as a side note: if you can keep short-lived allocations below this threshold, it is better for your overall GC memory management and performance.
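You can observe the large object heap cutoff directly, since freshly allocated LOH objects are reported as generation 2 while ordinary new objects start in generation 0. A minimal sketch (run as a standalone console program; the exact 85,000-byte figure applies to the object's total size, including array overhead):

```csharp
using System;

class LohDemo
{
    static void Main()
    {
        // An object of 85,000 bytes or more goes straight to the
        // large object heap, which the GC reports as generation 2
        // from the moment of allocation.
        var small = new byte[84000];
        var large = new byte[85000];

        Console.WriteLine(GC.GetGeneration(small)); // 0 (fresh small-object-heap allocation)
        Console.WriteLine(GC.GetGeneration(large)); // 2 (large object heap)
    }
}
```

The question's 30M-entry Dictionary is well past this threshold, so its internal arrays live on the LOH and are subject to the fragmentation caveat above.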
Update: The 2 GB single-object memory limit has been lifted on 64-bit with the release of .NET 4.5.
You'll need to set gcAllowVeryLargeObjects in your app.config. The maximum number of elements in an array is still 2^32-1, though.
See Single objects still limited to 2 GB in size in CLR 4.0? for more details.
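For reference, the opt-in is a runtime element in app.config; a minimal sketch of what that file would look like (the element name is real, the surrounding file is the standard config skeleton):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- .NET 4.5+, 64-bit only: allow single objects (arrays) larger than 2 GB.
         The per-array element count is still capped at 2^32-1. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```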