Periodically back up files to another server via scp
There is a set of log files with the pattern xxxxYYY, where xxxx is some text and YYY is a sequence number that increases by one and wraps around. Only the last n files are available at any given time.
I would like to write a foolproof script that makes sure all the log files are backed up on another server (via ssh/scp).
Can somebody please suggest the logic or a code snippet (Perl or shell) for it?
=> The script can run every few minutes, so that bursts of traffic do not cause log files to miss being backed up.
=> The rollover needs to be detected, so that files are not overwritten on the destination server/directory.
-> I do not have superuser access on either the source or the destination box. The destination box does not have rsync installed, and getting it installed would take too long.
-> Only one log file gets updated at a time.
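For concreteness, here is the kind of script I have in mind. It is only a rough sketch, and every name in it (LOGDIR, DEST, the app_ prefix, the state file) is a placeholder I made up. The idea is to fingerprint each file with cksum, so that a wrapped sequence number shows up as new content under an old name, and to copy under a unique timestamped name, so that nothing on the destination is ever overwritten:

    #!/bin/sh
    # Sketch only: LOGDIR, DEST, the app_ prefix and the state file
    # are hypothetical placeholders.
    LOGDIR=/var/log/myapp            # directory holding the rotating logs
    DEST=user@backuphost:backup/     # scp target; no root or rsync required
    STATE=$HOME/.logbackup.state     # remembers what was already copied

    touch "$STATE"
    for f in "$LOGDIR"/app_*; do
        [ -f "$f" ] || continue
        # Key = name + checksum: when the sequence number wraps and an old
        # name reappears with new content, the key changes and the file is
        # treated as new instead of being skipped.
        sum=$(cksum "$f" | awk '{print $1}')
        key=$(basename "$f").$sum
        grep -Fqx "$key" "$STATE" && continue    # already backed up
        # Copy under a unique timestamped name so the backup of a previous
        # incarnation of this log is never overwritten.
        stamp=$(date +%Y%m%d%H%M%S)
        scp -p "$f" "$DEST$(basename "$f").$stamp" && echo "$key" >> "$STATE"
    done

Run from cron every few minutes (with passwordless ssh keys set up for the unattended scp). Because the active log's checksum changes while it is being written, it gets re-copied on each run until it stops changing; that errs on the side of never losing data, at the cost of extra copies on the destination and a state file that needs occasional trimming.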
Comments (1)
I would look at having cron run an rsync --backup command.
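For illustration, a crontab entry along those lines might look like the following. The schedule, paths and host are made-up examples, and note that rsync over ssh also needs an rsync binary on the destination side, which the question says is not available there:

    # Hypothetical crontab entry; paths and host are examples.
    # --backup renames a destination file that is about to be overwritten
    # (default suffix ~), so a wrapped sequence number does not silently
    # replace the previous backup.
    */5 * * * * rsync -a --backup /var/log/myapp/ user@backuphost:backup/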