What is the fastest way to send a large binary file from one PC to another over the Internet?
I need to send large binary data (2 GB-10 GB) from one PC (the client) to another PC (the server) over the Internet. First I tried a WCF service hosted in IIS, using the wsHttpBinding binding with message security, but it took a lot of time (a few days), which is unacceptable for me. Now I am thinking about writing the client and server applications using sockets. Would that be faster?

What is the best way to do it?

Thanks
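For reference, the socket approach I am considering would look roughly like the minimal sketch below. The host name and port are placeholders, and there is no resume support, retry, or security here; it is only meant to show the shape of a raw TCP sender.

```csharp
using System.IO;
using System.Net.Sockets;

class SocketFileSender
{
    // Minimal sketch: stream a file over a plain TCP connection.
    // "server.example.com" and port 9000 are placeholder values.
    static void Main(string[] args)
    {
        const string host = "server.example.com";
        const int port = 9000;
        string path = args[0];

        using (var client = new TcpClient(host, port))
        using (NetworkStream network = client.GetStream())
        using (FileStream file = File.OpenRead(path))
        {
            var buffer = new byte[81920];           // 80 KB chunks
            int read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
            {
                network.Write(buffer, 0, read);     // push each chunk to the server
            }
        }
    }
}
```

The server side would just accept the connection with a TcpListener and copy the NetworkStream into a FileStream; either way the transfer is still bound by the connection's bandwidth.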
5 Answers
Plain old FTP would, in my opinion, be suitable in this case. By using it you will have the chance to resume an interrupted transfer without needing to redo the job from the start. You need to take into account the possibility that such a massive download gets interrupted for some reason.
When sending large amounts of data, you are limited by the bandwidth of the connection. You should also take care of disruptions in the connection: small disruptions can have a big impact if you have to resend a lot of data.

You can use BITS; it transfers the data in the background and divides the data into blocks, so it takes care of a lot of this for you.

It depends on IIS (on the server) and has a client API to transfer the data, so you do not need to implement the basics of the data transfer yourself.

I don't know if it will be faster, but it will at least be a lot more reliable than making a single HTTP or FTP request. And you can have it up and running very quickly.

If bandwidth is a problem and the data doesn't have to be sent over the Internet, you could also consider a high-bandwidth/low-latency connection such as sending a DVD by courier.

You can use BITS from .NET; on CodeProject there is a wrapper, as sketched below.
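Roughly, a download job through such a wrapper might look like the following. This assumes the SharpBITS.NET wrapper (the CodeProject project referred to above) and its BitsManager/BitsJob types; the exact names and signatures may differ in the version you use, so treat this as an unverified sketch, and the URL and path are placeholders.

```csharp
using SharpBits.Base;   // SharpBITS.NET wrapper around the BITS COM API (assumed API)

class BitsDownloadExample
{
    static void Main()
    {
        // Create a background download job; BITS persists it across reboots
        // and network drops and transfers the file in blocks.
        var manager = new BitsManager();
        BitsJob job = manager.CreateJob("large-file", JobType.Download);

        job.AddFile("http://server.example.com/data.bin", @"C:\temp\data.bin");
        job.Resume();   // start (or continue) the background transfer

        // Poll job.State (or subscribe to the job's events) and call
        // job.Complete() once the transfer has finished.
    }
}
```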
Well, the bandwidth is your problem; going even lower down to sockets won't help you much there, as the WCF overhead doesn't matter much for long binary responses. Maybe your option is to use some lossless streaming compression algorithm? That is, provided your data is compressible (do a dry run using zip; if it shrinks the file on local disk, you can find a suitable streaming algorithm). Btw, I would suggest providing resume support :)
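For the dry run, something like this sketch using GZipStream from System.IO.Compression would do; the file paths are placeholders, and whether it helps depends entirely on how compressible the data is.

```csharp
using System.IO;
using System.IO.Compression;

class CompressBeforeSending
{
    // Dry run: compress the file to disk and compare sizes before deciding
    // whether streaming compression is worth it for this data.
    static void Main()
    {
        const string source = @"C:\temp\data.bin";          // placeholder paths
        const string compressed = @"C:\temp\data.bin.gz";

        using (FileStream input = File.OpenRead(source))
        using (FileStream output = File.Create(compressed))
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            input.CopyTo(gzip);   // the same pattern works when writing to a network stream
        }

        long before = new FileInfo(source).Length;
        long after = new FileInfo(compressed).Length;
        System.Console.WriteLine("{0} -> {1} bytes", before, after);
    }
}
```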
Usually it's most appropriate to leverage something that's already been written for this type of thing, e.g. FTP, SCP, rsync etc.

FTP supports resuming if the download broke (a resume request from .NET is sketched after this answer), although I'm not sure if it supports resuming an upload. Rsync is much better at this kind of thing.

EDIT:

It might be worth considering something that I'm not terribly familiar with but might be another option - BitTorrent?

A further option is to roll your own client/server using a protocol library such as UDT, which will give you better-than-TCP performance. See: http://udt.sourceforge.net/
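On the FTP resume point, a minimal sketch using FtpWebRequest and its ContentOffset property to restart a partial download looks like this; the server URL, credentials, and local path are placeholders.

```csharp
using System.IO;
using System.Net;

class FtpResumeDownload
{
    static void Main()
    {
        const string remote = "ftp://server.example.com/data.bin";  // placeholders
        const string local = @"C:\temp\data.bin";

        // How much we already have on disk, if anything.
        long alreadyDownloaded = File.Exists(local) ? new FileInfo(local).Length : 0;

        var request = (FtpWebRequest)WebRequest.Create(remote);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("user", "password");
        request.ContentOffset = alreadyDownloaded;   // ask the server to skip the bytes we have

        using (WebResponse response = request.GetResponse())
        using (Stream ftpStream = response.GetResponseStream())
        using (var file = new FileStream(local, FileMode.Append, FileAccess.Write))
        {
            ftpStream.CopyTo(file);   // append only the remaining bytes
        }
    }
}
```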
Although there is some bandwidth overhead associated with higher-level frameworks, I have found WCF file transfer as a stream to be more than adequately fast, usually as fast as a regular file transfer over SMB. I have transferred hundreds of thousands of small files in a session, including larger files of 6-10 GB, sometimes larger. I never once had any major issues over any sort of decent connection.

I really like the interfaces it provides. It allows you to do some pretty cool stuff that FTP can't, like remoting or duplex endpoints. You get programmatic control over every aspect of the connection on both sides, and the two sides can communicate messages along with the files. Fun stuff.

Yes, FTP is fast and simple, if you don't need all that.
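To give an idea of the streamed-WCF approach, a service that receives a file as a single Stream might look roughly like the sketch below. The contract name, destination path, and endpoint address are assumptions for illustration; the essential parts are the Stream parameter and setting TransferMode to Streamed on the binding so multi-gigabyte uploads are never buffered in memory.

```csharp
using System.IO;
using System.ServiceModel;

// Hypothetical contract: the client pushes the file to the server as one stream.
[ServiceContract]
public interface IFileReceiver
{
    [OperationContract]
    void Upload(Stream data);
}

public class FileReceiver : IFileReceiver
{
    public void Upload(Stream data)
    {
        // Placeholder destination path; the incoming bytes are copied as they arrive.
        using (FileStream target = File.Create(@"C:\uploads\incoming.bin"))
        {
            data.CopyTo(target);
        }
    }
}

class HostExample
{
    static void Main()
    {
        // Streamed transfer with a large message size limit
        // (placeholder address; timeouts and security omitted for brevity).
        var binding = new BasicHttpBinding
        {
            TransferMode = TransferMode.Streamed,
            MaxReceivedMessageSize = long.MaxValue
        };

        using (var host = new ServiceHost(typeof(FileReceiver)))
        {
            host.AddServiceEndpoint(typeof(IFileReceiver), binding,
                                    "http://localhost:8080/upload");
            host.Open();
            System.Console.WriteLine("Listening; press Enter to stop.");
            System.Console.ReadLine();
        }
    }
}
```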