How do I transfer a large Zip file (50MB) to any client over SOAP using a WCF service?



I have a WCF service that returns a byte array containing a Zip file (50MB) to any client that requests it. If the Zip is very small (say 1MB), the SOAP response comes back from WCF with the byte array embedded in it, but the response is already very large even for a 1MB file. If I try to transfer the 50MB file, the service hangs and throws an out-of-memory exception because the SOAP response becomes huge.

  1. What is the best option available with WCF / web services for transferring large files (mainly ZIP format), given that I am currently sending back a byte array? Is there a better approach than returning the file that way?

  2. Is a WCF / web service the best way to transfer large files to any client, or is there a better option/technology that would give interoperability and scalability for 10,000 users?

My code is below:

        String pathfordownload = @"D:\New Folder.zip";
        // Read the entire zip into memory in one go; for a 50MB file the whole
        // payload ends up in this byte array (and again in the SOAP response),
        // which is where the out-of-memory exception comes from.
        FileStream F2D = new FileStream(pathfordownload, FileMode.Open, FileAccess.Read);
        BinaryReader binReader = new BinaryReader(F2D);
        binReader.BaseStream.Position = 0;
        byte[] binFile = binReader.ReadBytes(Convert.ToInt32(binReader.BaseStream.Length));
        binReader.Close();
        return binFile;

A working example or a pointer to real information would be really helpful, as I have been struggling with everything I can find on Google and have had no good results for the last week.


Answers (2)

梦醒灬来后我 2024-09-17 11:04:36


You can transfer a Stream through WCF, and then you can send files of (almost) unlimited length.
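
A minimal sketch of what that could look like, assuming a contract along these lines (IZipDownloadService and GetZipFile are illustrative names, not from the original post); the operation returns a Stream instead of byte[], so the file is never buffered whole in memory:

        using System.IO;
        using System.ServiceModel;

        [ServiceContract]
        public interface IZipDownloadService
        {
            [OperationContract]
            Stream GetZipFile();
        }

        public class ZipDownloadService : IZipDownloadService
        {
            public Stream GetZipFile()
            {
                // WCF reads from this stream while writing the response and
                // closes it once the reply has been sent.
                return File.OpenRead(@"D:\New Folder.zip");
            }
        }

On the binding (for example basicHttpBinding) you also need transferMode="Streamed" and a raised maxReceivedMessageSize on the receiving side; note that in streamed mode the operation can carry only a single Stream as its message body.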

仙气飘飘 2024-09-17 11:04:36


I've faced the exact same problem. The out-of-memory error is inevitable because you are using byte arrays.

What we did was flush the data to the hard drive, so instead of being limited by your virtual memory, your capacity for concurrent transactions is limited only by the HD space.

Then, for the transfer, we just placed the file on the other computer. Of course, in our case it was a server-to-server file transfer. If you want to be decoupled from the peer, you can use a file download over HTTP.

So instead of responding with the file, your service could respond with an HTTP URL pointing to the file location. Then, when the client has successfully downloaded it from the server with a standard HttpRequest or WebClient, it calls a method to delete the file. In SOAP that could be Delete(string url); in REST that would be a DELETE on the resource.

I hope this makes sense to you. The most important part is to understand that in scalable software, especially if you are looking at 10,000 clients (concurrent?), you should not rely on resources that are limited, like memory streams or byte arrays, but rather on large and easily expandable resources like a hard drive partition that could eventually be on a SAN, so IT could grow the partition as needed.
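
A rough sketch of that hand-off, with illustrative names (IFileHandoffService, GetDownloadUrl, DeleteFile) that are not from the answer; the service hands out an HTTP URL for the prepared zip, the client downloads it with WebClient, then asks the service to delete it:

        using System.Net;
        using System.ServiceModel;

        [ServiceContract]
        public interface IFileHandoffService
        {
            // Returns an HTTP URL pointing at the prepared zip on the server.
            [OperationContract]
            string GetDownloadUrl();

            // Called by the client after a successful download so the server
            // can reclaim the disk space.
            [OperationContract]
            void DeleteFile(string url);
        }

        public static class ZipClient
        {
            public static void Download(IFileHandoffService service, string localPath)
            {
                string url = service.GetDownloadUrl();
                using (var web = new WebClient())
                {
                    // Plain HTTP download, no SOAP envelope around the payload.
                    web.DownloadFile(url, localPath);
                }
                service.DeleteFile(url);
            }
        }

The SOAP part of the exchange stays tiny (just the URL), while the heavy transfer goes over plain HTTP, which web servers already handle and scale well.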
