Heap fragmentation when using byte arrays

Posted 2024-11-02 02:56:25

I have a C# 4.0 application (single producer/single consumer) which transfers huge amounts of data in chunks. Although there is no new memory allocation, I run out of memory after a while.

I profiled memory using the Redgate memory profiler and there is a lot of free memory there. It says the free memory cannot be used because of fragmentation.

I use a blocking collection as the buffer and byte arrays as the members:

BlockingCollection<byte[]> _segments = new BlockingCollection<byte[]>(8);
// producer:
_segments.Add(buffer);
// consumer:
byte[] buffer = _segments.Take();

How can I avoid managed memory fragmentation?

3 Answers

椵侞 2024-11-09 02:56:25

You probably ran into the large object heap problem: objects larger than 85,000 bytes are placed on the large object heap, which is not compacted, and that can lead to strange out-of-memory situations. Although performance has apparently improved in .NET 4, it is still far from perfect. The solution is basically to use your own buffer pool that holds a few statically allocated chunks of memory and reuses them.
There is a whole bunch of questions about this on SO.

Update: Microsoft provides a buffer manager (System.ServiceModel.Channels.BufferManager) as part of the WCF stack. There is also one on CodeProject.
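A minimal sketch of the buffer-pool idea described above (the class and method names here are illustrative, not from the question or from BufferManager). All large allocations happen once in the constructor; the producer rents a buffer, fills it, and the consumer returns it, so the LOH never accumulates fragmented holes:

```csharp
using System;
using System.Collections.Concurrent;

// Illustrative buffer pool: fixed-size byte[] blocks allocated once and recycled.
sealed class BufferPool
{
    private readonly ConcurrentBag<byte[]> _free = new ConcurrentBag<byte[]>();
    private readonly int _bufferSize;

    public BufferPool(int bufferCount, int bufferSize)
    {
        _bufferSize = bufferSize;
        for (int i = 0; i < bufferCount; i++)
            _free.Add(new byte[bufferSize]);   // all large allocations happen here, once
    }

    // Hand out a pooled buffer, or allocate a fresh one if the pool is temporarily empty.
    public byte[] Rent()
    {
        byte[] buffer;
        return _free.TryTake(out buffer) ? buffer : new byte[_bufferSize];
    }

    // Put the buffer back so the next Rent() reuses the same memory block.
    public void Return(byte[] buffer)
    {
        if (buffer != null && buffer.Length == _bufferSize)
            _free.Add(buffer);
    }
}
```

In the question's setup, the producer would call Rent() instead of allocating, pass the filled buffer through the BlockingCollection, and the consumer would call Return() when it is done with it.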

提笔落墨 2024-11-09 02:56:25

How large are your byte[] arrays? Do they fall into the small object heap or the large object heap? If you are experiencing memory fragmentation, I would say they fall into the LOH.

You should therefore reuse the same byte arrays (use a pool) or use smaller chunks. The LOH is never compacted, so it can become quite fragmented. Sadly there is no way around this (apart from knowing about this limitation and avoiding it).
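A quick way to check which heap your arrays land on (an illustrative sketch): objects on the LOH are treated as generation 2 from the moment they are allocated, so GC.GetGeneration reveals where an array went. The documented threshold is roughly 85,000 bytes of total object size:

```csharp
using System;

class LohCheck
{
    static void Main()
    {
        byte[] small = new byte[80 * 1024];   // well under 85,000 bytes -> small object heap
        byte[] large = new byte[100 * 1024];  // over 85,000 bytes -> large object heap

        // Freshly allocated SOH objects start in generation 0;
        // LOH objects report generation 2 immediately.
        Console.WriteLine(GC.GetGeneration(small));
        Console.WriteLine(GC.GetGeneration(large));
    }
}
```

If your chunks turn out to be over the threshold, either pool them as suggested above or shrink them below it so the compacting small-object heap handles them.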

驱逐舰岛风号 2024-11-09 02:56:25

Although the GC doesn't compact the large object heap for you, you can still compact it programmatically on .NET 4.5.1 and later (the API does not exist in the .NET 4.0 the question targets). The following code snippet illustrates how this can be achieved.

GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();
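For context, a self-contained sketch of the same pattern (assuming .NET 4.5.1 or later): the setting is one-shot, applying to the next full blocking collection and then reverting to Default, so it must be set again before each compaction you want.

```csharp
using System;
using System.Runtime;

class CompactLohOnce
{
    static void Main()
    {
        // Request LOH compaction on the next full blocking GC.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();

        // The flag is one-shot: it reverts to Default after the collection runs.
        Console.WriteLine(GCSettings.LargeObjectHeapCompactionMode);
    }
}
```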