Robust network connection handling in C# on an unstable network
I've got quite a few code snippets which correctly download x/html pages and files off the Internet, and they - for the most part - work quite well.
Unfortunately, my home Internet connection becomes near-unusable between midnight and 6am. I get a lot of packet loss and stalled connections. Similarly, I have a backup 3G connection at work which has intermittent outages depending on network load.
Of course I can put the download code into a try/catch - but ideally I would like the file to be correctly downloaded, regardless of how long it takes.
I have tried a number of things to resolve this, and have had mixed success.
I've tried getting the length of the stream I'm capturing, and verifying that my local copy is the same length. For some reason, the (correctly downloaded) local copy always seemed to be a few dozen bytes shorter than the reported length of the stream, but the actual number by which it was shorter seemed to vary. Furthermore, not all streams have the length available.
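A sketch of that kind of length check, assuming HttpWebRequest (worth noting: ContentLength is -1 when the server sends no length header, and it reflects the compressed size when automatic decompression is enabled - either of which could explain the mismatch):

using System;
using System.IO;
using System.Net;

static class LengthCheckedFetch
{
    // Hypothetical single-attempt download with the length check
    // described above; not the original code, just a reconstruction.
    public static MemoryStream FetchOnce(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var stream = response.GetResponseStream())
        {
            var local = new MemoryStream();
            stream.CopyTo(local);

            // ContentLength is -1 when the server omits the header, and
            // reflects the compressed size when automatic decompression
            // is on - both can make a good download look "short".
            if (response.ContentLength >= 0 && local.Length != response.ContentLength)
                throw new WebException("Length mismatch - possibly incomplete download.");

            local.Position = 0;
            return local;
        }
    }
}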
I also tried putting my try/catch'd segment inside a for loop - making it try the download, say, 50 times. A successful download would break, whereas a failed one would try again.
I augmented the above with a gradually increasing delay - I had a Thread.Sleep() which would gradually sleep for longer and longer at each iteration before trying again.
I imagine this task isn't as easy as it sounds, since a lot of my personal software - Steam, Chrome, Windows Update, iTunes, etc. - can't seem to deal with my flaky connection at all. I frequently have to attempt to download large files 10-15 times before they come down successfully. Either these devs don't care about handling connection problems, or it is difficult to do.
If it is possible, can someone please provide a code snippet which will grab a stream off the Internet but will handle a flaky connection?
I imagine it should be something like:
public static MemoryStream Fetch(string url, int? maxRetries, int? maxDuration)
{
    // Download the response of the url to a MemoryStream.
    // Assume if null maxRetries and maxDuration, keep trying forever.
}
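For reference, here is a minimal sketch along those lines, combining the retry loop and growing delay I described above. It assumes HttpWebRequest, uses a linear backoff capped at 30 seconds, and interprets maxDuration as milliseconds (the signature above doesn't specify a unit):

using System;
using System.IO;
using System.Net;
using System.Threading;

static class Downloader
{
    public static MemoryStream Fetch(string url, int? maxRetries, int? maxDuration)
    {
        var started = DateTime.UtcNow;

        for (int attempt = 0; ; attempt++)
        {
            try
            {
                var request = (HttpWebRequest)WebRequest.Create(url);
                using (var response = (HttpWebResponse)request.GetResponse())
                using (var stream = response.GetResponseStream())
                {
                    var result = new MemoryStream();
                    stream.CopyTo(result);
                    result.Position = 0;
                    return result;
                }
            }
            catch (Exception ex) when (ex is WebException || ex is IOException)
            {
                // Give up once either limit is hit; with both null, retry forever.
                if (maxRetries.HasValue && attempt + 1 >= maxRetries.Value)
                    throw;
                if (maxDuration.HasValue &&
                    (DateTime.UtcNow - started).TotalMilliseconds >= maxDuration.Value)
                    throw;

                // Grow the delay with each failed attempt, capped at 30 seconds.
                Thread.Sleep(Math.Min((attempt + 1) * 1000, 30000));
            }
        }
    }
}

One caveat with this shape: a connection that dies mid-stream throws away everything buffered so far and restarts from byte zero. Resuming with an HTTP Range request would be the next refinement, though not every server supports it.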
Comments (1)
I would suggest looking into an existing solution. My first two thoughts are to either use BITS or shell out to Robocopy. Both are well vetted for downloading files over potentially unstable networks (BITS is what Windows Update uses to download updates to your machine).
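For example, BITS can be driven without any extra libraries by shelling out to the bitsadmin tool (deprecated but still shipped with Windows; PowerShell's Start-BitsTransfer is the modern route). A rough sketch:

using System.Diagnostics;

static class BitsDownload
{
    // Rough sketch: hand the download to BITS via bitsadmin. BITS
    // queues the job and retries across network drops on its own.
    // Note: bitsadmin requires an absolute local destination path.
    public static bool Download(string url, string destinationPath)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "bitsadmin.exe",
            Arguments = string.Format(
                "/transfer flakyFetch /download /priority normal \"{0}\" \"{1}\"",
                url, destinationPath),
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
            return process.ExitCode == 0;
        }
    }
}

Robocopy has similar retry behaviour built in (the /R and /Z flags), but it copies from file shares rather than HTTP URLs, so BITS is the closer fit for this question.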