How can I improve deserialization speed?

Published 2024-08-08 09:42:39 · Words: 707 · Views: 4

Serializing/deserializing with BinaryFormatter, the resulting serialized file is ~80MB in size. Deserialization takes a few minutes. How could I improve on this? Here's the deserialization code:

    public static Universe DeserializeFromFile(string filepath)
    {
        Universe universe = null;

        FileStream fs = new FileStream(filepath, FileMode.Open);

        BinaryFormatter bf = new BinaryFormatter();
        try
        {
            universe = (Universe)bf.Deserialize(fs);
        }
        catch (SerializationException e)
        {
            Console.WriteLine("Failed to deserialize. Reason: " + e.Message);
            throw;
        }
        finally
        {
            fs.Close();
        }

        return universe;
    }

Maybe read everything into memory before deserializing, or use some other serialization technique?



Comments (6)

别想她 2024-08-15 09:42:39

Try UnsafeDeserialize. It is said to improve speed.
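A minimal sketch of what this suggestion looks like, assuming a `Universe` class like the one in the question. `BinaryFormatter.UnsafeDeserialize` skips some header/permission processing on .NET Framework, which can help with large payloads; passing `null` for the `HeaderHandler` is fine when you don't use remoting headers. Note it is not available on modern .NET (Core):

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static Universe DeserializeUnsafe(string filepath)
{
    // Open read-only; UnsafeDeserialize avoids some of the per-call
    // overhead that Deserialize incurs on .NET Framework.
    using (var fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
    {
        var bf = new BinaryFormatter();
        return (Universe)bf.UnsafeDeserialize(fs, null); // null: no remoting headers
    }
}
```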

忆依然 2024-08-15 09:42:39


I know this is an old question, but stumbled upon a solution that improved my deserialization speed substantially. This is useful if you have large sets of data.

Upgrade your target framework to 4.7.1+ and enable the following switch in your app.config.

<runtime>
    <!-- Use this switch to make BinaryFormatter fast with large object graphs starting with .NET 4.7.2 -->
    <AppContextSwitchOverrides value="Switch.System.Runtime.Serialization.UseNewMaxArraySize=true" />
</runtime>

Sources:
BinaryFormatter
AppContextSwitchOverrides

骄傲 2024-08-15 09:42:39

Please take a look at this thread.

客…行舟 2024-08-15 09:42:39

Try reading the file into a memory stream first in one go, then deserialize using the memory stream.
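A sketch of this idea, reusing the `Universe` type from the question. Reading the whole file in one sequential pass and then deserializing from a `MemoryStream` avoids the many small disk reads that BinaryFormatter otherwise issues against an unbuffered `FileStream`:

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static Universe DeserializeBuffered(string filepath)
{
    // One sequential read of the entire file into memory.
    byte[] data = File.ReadAllBytes(filepath);

    using (var ms = new MemoryStream(data, writable: false))
    {
        var bf = new BinaryFormatter();
        return (Universe)bf.Deserialize(ms);
    }
}
```

A lighter variant is to keep the `FileStream` but construct it with a large buffer, e.g. `new FileStream(filepath, FileMode.Open, FileAccess.Read, FileShare.Read, 1 << 20)`.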

嗫嚅 2024-08-15 09:42:39


How complex is the data? If it is an object tree (rather than a full graph), then you might get some interesting results from trying protobuf-net. It is generally pretty easy to fit onto existing classes, and is generally much smaller, faster, and less brittle (you can change the object model without trashing the data).

Disclosure: I'm the author, so might be biased - but it really isn't terrible... I'd happily lend some* time to help you try it, though.

*=within reason
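A hedged sketch of what adopting protobuf-net looks like; the `Universe`/`Planet` property names here are placeholders, not from the original question. You annotate the existing classes and swap the formatter calls for `Serializer`:

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

[ProtoContract]
public class Universe
{
    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2)] public List<Planet> Planets { get; set; }
}

[ProtoContract]
public class Planet
{
    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2)] public double Mass { get; set; }
}

public static class UniverseIO
{
    public static void SerializeToFile(string filepath, Universe u)
    {
        using (var fs = File.Create(filepath))
            Serializer.Serialize(fs, u);        // compact wire format
    }

    public static Universe DeserializeFromFile(string filepath)
    {
        using (var fs = File.OpenRead(filepath))
            return Serializer.Deserialize<Universe>(fs);
    }
}
```

Because fields are identified by the `[ProtoMember]` numbers rather than by type metadata, the file is typically much smaller than BinaryFormatter output and the class can evolve without breaking old data.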

所谓喜欢 2024-08-15 09:42:39

Implement ISerializable in the Universe class
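A sketch of that approach, with placeholder fields standing in for whatever `Universe` actually holds. Implementing `ISerializable` lets you write only the data you need instead of the full reflected object graph, which can shrink the file and speed up both directions:

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class Universe : ISerializable
{
    private readonly string name;      // placeholder fields
    private readonly double[] masses;

    public Universe(string name, double[] masses)
    {
        this.name = name;
        this.masses = masses;
    }

    // Called during serialization: write only what is needed, under short keys.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("n", name);
        info.AddValue("m", masses);
    }

    // Deserialization constructor: read the values back by key.
    protected Universe(SerializationInfo info, StreamingContext context)
    {
        name = info.GetString("n");
        masses = (double[])info.GetValue("m", typeof(double[]));
    }
}
```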
