Is there a standard for change detection in remote file backup?
I'm going to need an efficient remote change-detection algorithm for backing up an ordinary filesystem.
The files are backed up to a remote machine and bandwidth is at a premium, so it's going to be difficult to compare files. I've looked into Remote Differential Compression and rsync, but I don't know which direction I should go from here. Which is more bandwidth efficient? What does commercial backup software do? Is there a standard algorithm everyone uses?
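For context, this is my simplified understanding of the rsync-style approach (a rough sketch under my own assumptions, not rsync's actual implementation; the block size, helper names, and delta format are just illustrative): the receiver sends a small per-block signature of its old copy, and the sender scans its new copy to find blocks the receiver already has, so only changed data crosses the wire.

```python
import hashlib

BLOCK_SIZE = 4096  # tunable; real tools pick this more carefully


def weak_checksum(data: bytes) -> int:
    """Adler-32-style weak checksum; cheap enough to roll byte by byte."""
    a = sum(data) % 65521
    b = sum((len(data) - i) * byte for i, byte in enumerate(data)) % 65521
    return (b << 16) | a


def strong_hash(data: bytes) -> bytes:
    """Collision-resistant hash used to confirm a weak-checksum match."""
    return hashlib.sha256(data).digest()


def make_signature(old_data: bytes) -> dict:
    """Receiver side: signature of each block of the old file (sent to the sender)."""
    sig = {}
    for offset in range(0, len(old_data), BLOCK_SIZE):
        block = old_data[offset:offset + BLOCK_SIZE]
        sig.setdefault(weak_checksum(block), {})[strong_hash(block)] = offset
    return sig


def delta(new_data: bytes, signature: dict) -> list:
    """Sender side: emit ('copy', offset) for blocks the receiver already has,
    and ('literal', bytes) only for data it does not."""
    ops, i, literal = [], 0, bytearray()
    while i < len(new_data):
        block = new_data[i:i + BLOCK_SIZE]
        match = signature.get(weak_checksum(block), {}).get(strong_hash(block))
        if match is not None and len(block) == BLOCK_SIZE:
            if literal:
                ops.append(("literal", bytes(literal)))
                literal.clear()
            ops.append(("copy", match))
            i += BLOCK_SIZE
        else:
            # no match at this offset: ship one literal byte and slide the window
            # (a real implementation rolls the weak checksum instead of recomputing it)
            literal.append(new_data[i])
            i += 1
    if literal:
        ops.append(("literal", bytes(literal)))
    return ops
```

So the bandwidth cost is roughly one signature per block plus the literal data for whatever actually changed, which is why I'm unsure how this compares with Remote Differential Compression in practice.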
1 Answer
I found two very good articles on this:
Remote File Synchronization Single-Round Algorithms explains and compares the leading methods; it is very helpful.
Algorithms for Low-Latency Remote File Synchronization goes into a lot of technical detail on synchronization based on set reconciliation techniques.
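To give a feel for what "set reconciliation" means here, this is a naive baseline sketch (my own illustration, not the papers' algorithms; the function names and the (path, hash) representation are assumptions): each side describes its files as a set of (path, content-hash) pairs, and synchronization reduces to reconciling the symmetric difference of those sets. The point of the cited papers is that this reconciliation can be done with communication roughly proportional to the size of the difference rather than to the whole set, which is what you want when bandwidth is at a premium.

```python
import hashlib
from pathlib import Path


def snapshot(root: str) -> set:
    """Describe a directory tree as a set of (relative path, SHA-256) pairs."""
    entries = set()
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            entries.add((str(path.relative_to(root)), digest))
    return entries


def reconcile(local: set, remote: set):
    """Naive reconciliation: paths to upload (new or changed locally)
    and paths to delete remotely (removed locally)."""
    local_paths = {p for p, _ in local}
    to_upload = {p for p, _ in local - remote}
    to_delete = {p for p, _ in remote - local} - local_paths
    return to_upload, to_delete
```

The naive version exchanges the full hash sets, which is exactly the bandwidth cost the low-latency set-reconciliation schemes avoid.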