Limits of FileStream's Read/Write methods

Published 2024-11-01 10:54:49


FileStream's Read/Write methods take only an int as the byte count, but the FileStream object's Length property returns a long. What happens if the file is larger than int.MaxValue (roughly 2 GB)? How do FileStream's Read/Write methods handle a long value?


3 Answers

拿命拼未来 2024-11-08 10:54:49


Then you read and write in multiple chunks. The CLR has a limit on the size of any particular object anyway (also around 2GB IIRC, even on a 64-bit CLR), so you couldn't have a byte array big enough for it to be a problem.

You should always loop when reading anyway, as you can't guarantee that a Read call will read as many bytes as you requested, even if there's more data to come.

EDIT: Reading in chunks:

byte[] buffer = new byte[1024 * 32];
int bytesRead;

// Read may return fewer bytes than requested; loop until it returns 0 (end of stream).
while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
{
    // Use the first bytesRead bytes of buffer here
}

Writing in chunks will depend on what you're writing... it's hard to talk about it in the abstract.
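One possible sketch of chunked writing (the file name "big-output.bin" and the 3 GB total are made-up placeholders): the total amount of data is tracked as a long, while each individual Write call stays within int range:

```csharp
using System;
using System.IO;

class ChunkedWriteDemo
{
    static void Main()
    {
        long totalBytes = 3L * 1024 * 1024 * 1024; // total tracked as a long (3 GB here)
        byte[] chunk = new byte[1024 * 32];        // each Write call gets an int count

        using (var output = new FileStream("big-output.bin", FileMode.Create))
        {
            long written = 0;
            while (written < totalBytes)
            {
                // Clamp the final chunk so the int cast is always safe
                int count = (int)Math.Min(chunk.Length, totalBytes - written);
                output.Write(chunk, 0, count);
                written += count;
            }
        }
    }
}
```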

眉黛浅 2024-11-08 10:54:49


There is no need to write more than 2 GB of data directly in one call.

If you really had that amount buffered contiguously in memory (maybe as an unsafely acquired UnmanagedMemoryStream to implement a core dump?), you could easily batch the writes into multiple calls. It will get written to disk in blocks of 512 bytes to at most 4 KB on current hardware anyway.

The great value of 'streaming' interfaces is that you can have it any which way. In fact, when you look into it you will find that CLR arrays (and everything else) are actually limited to 2 GB.

Update

Since you have now clarified that you basically want to copy streams, you might be better served with a ready-made solution. There is File.Copy:

File.Copy("file-a.txt", "file-new.txt");

Or there is the standard answer:

Stream input = ..., output = ...;
input.CopyTo(output); // .NET 4.0 and later

// .NET 3.5 and earlier
public static void CopyStream(Stream input, Stream output)
{
    byte[] buffer = new byte[32768];
    while (true)
    {
        int read = input.Read(buffer, 0, buffer.Length);
        if (read <= 0)
            return;
        output.Write(buffer, 0, read);
    }
}

Don't forget about Flushing, Closing and Disposing your Streams as appropriate if you are handling the streams manually.
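If you handle the streams by hand, `using` blocks are the usual way to get the flush/close/dispose for free. A minimal sketch, reusing the placeholder file names from the File.Copy example above:

```csharp
using System.IO;

class CopyWithUsingDemo
{
    static void Main()
    {
        using (var input = new FileStream("file-a.txt", FileMode.Open, FileAccess.Read))
        using (var output = new FileStream("file-new.txt", FileMode.Create, FileAccess.Write))
        {
            input.CopyTo(output);
        } // Dispose runs here, flushing and closing both streams
    }
}
```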

Cheers

水水月牙 2024-11-08 10:54:49


As far as I know, you can still use Seek to get to the right position in the stream.

You will probably have to loop to do so, and if you want to read more than 2 GB you will need to loop here too.
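A minimal sketch of that ("huge.bin" and the 3 GB offset are placeholders): Seek takes a long offset, so you can jump past the 2 GB mark directly and then read in int-sized chunks:

```csharp
using System.IO;

class SeekDemo
{
    static void Main()
    {
        // "huge.bin" stands in for a file larger than 2 GB.
        using (var stream = new FileStream("huge.bin", FileMode.Open, FileAccess.Read))
        {
            long position = 3L * 1024 * 1024 * 1024; // 3 GB, well past int.MaxValue
            stream.Seek(position, SeekOrigin.Begin);

            byte[] buffer = new byte[1024 * 32];
            int bytesRead = stream.Read(buffer, 0, buffer.Length);
            // The next bytesRead bytes after the 3 GB mark are now in buffer
        }
    }
}
```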
