Is the size of a single object still limited to 2 GB in CLR 4.0?
As I understand it, there's a 2 GB limit on single instances in .NET. I haven't paid a lot of attention to that since I have mainly worked on 32-bit OSes so far. On 32 bit it is more or less an artificial limitation anyway. However, I was quite surprised to learn that this limitation also applies on 64-bit .NET.
Since collections such as List<T> use an array to store items, that means that a .NET application running on 32 bit will be able to hold twice as many reference-type items in a list as the same application running on 64 bit. That is quite surprising, IMO.
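The "twice as many" claim follows from back-of-the-envelope arithmetic (a rough sketch that ignores object-header and array-bookkeeping overhead):

```csharp
// The 2 GB cap is on the object's total size in bytes, so the maximum
// element count of a reference-type array depends on the reference width.
const long MaxObjectBytes = 2L * 1024 * 1024 * 1024;  // 2 GB per-object limit

long maxRefs32 = MaxObjectBytes / 4;  // 4-byte refs on 32 bit: 536,870,912
long maxRefs64 = MaxObjectBytes / 8;  // 8-byte refs on 64 bit: 268,435,456

System.Console.WriteLine(maxRefs32 / maxRefs64);  // prints 2
```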
Does anyone know if this limitation is addressed in CLR 4.0? (I don't have a 4.0 installation at hand at the moment.)
3 Answers
It's worse than that - your process space, when you're working in .NET on 32 bit, is much smaller than the theoretical limit. In 32-bit .NET apps, my experience is that you'll always tend to start getting out-of-memory errors somewhere around 1.2-1.4 GB of memory usage (some people say they can get to 1.6... but I've never seen that). Of course, this isn't a problem on 64-bit systems.
That being said, a single 2 GB array of reference types, even on 64-bit systems, is a huge number of objects. Even with 8-byte references, you have the ability to allocate an array of 268,435,456 object references - each of which can be very large (up to 2 GB, more if they're using nested objects). That's more memory than would ever really be required by most applications.
One of the members of the CLR team blogged about this, with some options for ways to work around these limitations. On a 64-bit system, doing something like his BigArray<T> would be a viable solution to allocate any number of objects into an array - much more than the 2 GB single-object limit. P/Invoke can allow you to allocate larger arrays as well.
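A minimal sketch of the chunked-array idea (not the blog post's actual code; the class shape and the ChunkSize value are assumptions): each chunk stays safely under the 2 GB per-object cap, while the wrapper exposes a single 64-bit index space.

```csharp
using System;

// Hypothetical chunked array: spreads elements across many small
// sub-arrays so no single object approaches the 2 GB limit.
public class BigArray<T>
{
    private const int ChunkSize = 1 << 20; // 1M elements per chunk (assumption)
    private readonly T[][] chunks;

    public long Length { get; }

    public BigArray(long length)
    {
        Length = length;
        long chunkCount = (length + ChunkSize - 1) / ChunkSize;
        chunks = new T[chunkCount][];
        for (long i = 0; i < chunkCount; i++)
        {
            // Last chunk may be shorter than ChunkSize.
            long remaining = length - i * ChunkSize;
            chunks[i] = new T[Math.Min(remaining, ChunkSize)];
        }
    }

    // 64-bit indexer: route the index to the right chunk and offset.
    public T this[long index]
    {
        get => chunks[index / ChunkSize][index % ChunkSize];
        set => chunks[index / ChunkSize][index % ChunkSize] = value;
    }
}
```

The trade-off is an extra indirection per element access, and the elements are no longer contiguous in memory, so this doesn't help when you need to pass the whole buffer to native code in one piece.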
Edit: I should have mentioned this, as well - I do not believe this behavior has changed at all for .NET 4. The behavior has been unchanged since the beginning of .NET.
Edit: .NET 4.5 will now have the option in x64 to explicitly allow objects to be larger than 2 GB by setting gcAllowVeryLargeObjects in the app.config.
.NET Framework 4.5 allows creating arrays larger than 2GB on 64 bit platforms. This feature is not on by default, it has to be enabled via config file using the gcAllowVeryLargeObjects element.
http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx
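The element goes in the `<runtime>` section of the application's configuration file, e.g.:

```xml
<configuration>
  <runtime>
    <!-- 64-bit processes only: lifts the 2 GB per-object cap for arrays.
         Per-dimension element counts are still limited (roughly 2.1 billion). -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```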
This is a big deal in the numerical field. Anyone using numerical class libraries in .NET has their matrices stored as arrays underneath. This is so native libraries can be called to do the number-crunching. The 2GB limit seriously hampers the size of matrices possible in 64-bit .NET. More here.