Using memory-mapped files in C# to store reference types

Asked 2024-08-30 06:59:35

I need to store a dictionary to a file as fast as possible. Both keys and values are objects and are not guaranteed to be marked as Serializable. I would also prefer a method faster than serializing thousands of objects, so I looked into the memory-mapped file support in .NET 4. However, it seems MemoryMappedViewAccessor only allows storage of structs, not reference types.

Is there a way of storing the memory used by a reference type to a file and reconstructing the object from that blob of memory (without binary serialization)?
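A minimal sketch of the restriction being described, with a hypothetical blittable Entry struct and arbitrary file name and capacity (not from the question): MemoryMappedViewAccessor.Write<T> is constrained to value types containing no references, so a Dictionary or any other class cannot be written directly.

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

// Hypothetical blittable value type, used only for illustration.
struct Entry
{
    public int Key;
    public double Value;
}

class MmfStructDemo
{
    static void Main()
    {
        // "entries.bin" and the 4 KB capacity are arbitrary choices for this sketch.
        using var mmf = MemoryMappedFile.CreateFromFile(
            "entries.bin", FileMode.OpenOrCreate, null, 4096);
        using var accessor = mmf.CreateViewAccessor();

        var entry = new Entry { Key = 42, Value = 3.14 };
        accessor.Write(0, ref entry);          // OK: Entry is a plain value type
        accessor.Read(0, out Entry roundTrip); // reads the same raw bytes back
        Console.WriteLine($"{roundTrip.Key} {roundTrip.Value}");

        // var dict = new Dictionary<string, object>();
        // accessor.Write(0, ref dict);  // does not compile: Write<T> requires T : struct
    }
}
```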

Answers (3)

茶色山野 2024-09-06 06:59:35

Memory-mapped files are fundamentally incompatible with the garbage collector, which is why it took so long for such a principal operating-system feature to get supported by .NET. Reference types need to be serialized to the MMF view (MemoryMappedViewStream); there is no way around that. A similar restriction exists in unmanaged code: objects with pointers need to be flattened so that the pointed-to objects are visible in the view as well.

Whether you serialize them to an MMF or to a file won't make any difference; the file system cache is implemented with MMFs as well. File writes are very fast as long as the written data fits in available mappable memory. If that's an issue, look at a 64-bit operating system to solve it.
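To make the point concrete, here is a minimal sketch (the file name, capacity, and placeholder payload are assumptions, not from the answer): whatever serializer produces the bytes, the mapped view only ever accepts those flattened bytes, written through MemoryMappedViewStream's ordinary Stream API.

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Text;

class MmfViewStreamDemo
{
    static void Main()
    {
        // Stand-in for the output of whatever serializer you choose;
        // the mapped view only ever sees flattened bytes.
        byte[] payload = Encoding.UTF8.GetBytes("serialized dictionary contents");

        // "cache.bin" and the 4 KB capacity are arbitrary for this sketch.
        using var mmf = MemoryMappedFile.CreateFromFile(
            "cache.bin", FileMode.OpenOrCreate, null, 4096);

        // MemoryMappedViewStream is the MMF view mentioned above: a plain
        // Stream over the mapped region, backed by the file system cache.
        using MemoryMappedViewStream view = mmf.CreateViewStream();
        view.Write(payload, 0, payload.Length);
    }
}
```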

囍孤女 2024-09-06 06:59:35

I believe that storing a raw blob of memory is simply unworkable, because if that memory contains reference types it will hold pointers to other blocks of memory that will likely no longer be valid the next time the file is accessed. That's why binary serialization exists: to maintain these kinds of references. If you really want tight control, though, I would use System.IO.BinaryWriter and BinaryReader to control exactly what is written to the file and in what order, while minimizing overhead.
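A sketch of that hand-rolled approach, assuming a hypothetical Person class standing in for the non-serializable value objects: write the entry count, then each key and the fields of each value in a fixed order, and read them back in exactly the same order.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Hypothetical type standing in for the dictionary's non-serializable objects.
class Person
{
    public string Name;
    public int Age;
}

class BinaryWriterDemo
{
    static void Save(Dictionary<int, Person> dict, string path)
    {
        using var writer = new BinaryWriter(File.Create(path));
        writer.Write(dict.Count);              // entry count first
        foreach (var pair in dict)
        {
            writer.Write(pair.Key);            // then each field in a fixed order
            writer.Write(pair.Value.Name);
            writer.Write(pair.Value.Age);
        }
    }

    static Dictionary<int, Person> Load(string path)
    {
        using var reader = new BinaryReader(File.OpenRead(path));
        int count = reader.ReadInt32();
        var dict = new Dictionary<int, Person>(count);
        for (int i = 0; i < count; i++)
        {
            int key = reader.ReadInt32();      // read back in the same order as written
            var person = new Person { Name = reader.ReadString(), Age = reader.ReadInt32() };
            dict[key] = person;
        }
        return dict;
    }

    static void Main()
    {
        var dict = new Dictionary<int, Person> { [1] = new Person { Name = "Ada", Age = 36 } };
        Save(dict, "people.bin");
        Console.WriteLine(Load("people.bin")[1].Name);
    }
}
```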

柠檬 2024-09-06 06:59:35

This is the type of scenario that binary serialization was designed for. Is there some specific reason why you don't want to use that? Have you verified that it's 'too slow'? Sure, you can code your own custom serializer and probably make it more efficient for your specific scenario, but then you'll have to maintain it going forward. Is it worth the effort?
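For reference, the .NET 4-era binary serialization this answer presumably means looks like the sketch below, with a hypothetical [Serializable] CacheEntry class. Note the catch for the asker: it only works on types marked Serializable, and BinaryFormatter is obsolete and disabled by default in modern .NET.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// BinaryFormatter only handles types marked [Serializable]; hypothetical example type.
[Serializable]
class CacheEntry
{
    public string Name;
    public int Count;
}

class BinaryFormatterDemo
{
    static void Main()
    {
        var dict = new Dictionary<int, CacheEntry>
        {
            [1] = new CacheEntry { Name = "widgets", Count = 10 }
        };

        // Obsolete in modern .NET (SYSLIB0011); shown only as the .NET 4-era baseline.
        var formatter = new BinaryFormatter();

        using (var stream = File.Create("dict.bin"))
        {
            formatter.Serialize(stream, dict);     // serializes the whole object graph
        }

        using (var stream = File.OpenRead("dict.bin"))
        {
            var back = (Dictionary<int, CacheEntry>)formatter.Deserialize(stream);
            Console.WriteLine(back[1].Name);
        }
    }
}
```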
