Sync the home directories of multiple clients to a server

Published 2024-07-16 06:53:45 · 619 characters · 9 views · 0 comments


I'm using multiple Linux laptops/desktops and want them to "share" home directories.

NFS is unfortunately not an option, so I was trying to create a bash script using rsync, but I can't figure out how to do it.

This is my example right now:

#!/bin/bash

sync() {
  rsync -azvR --exclude-from=/home/ME/.rsync_excludes --delete -e 'ssh -ax' "$1" "$2"
}

sync /home/ME server.domain:/home/ME
#sync server.domain:/home/ME /home/ME

I think this would work great if I were only using a single client machine to update the server files. Correct?

What if I delete a file on one client? Won't that file need to be deleted on the other clients as well (after a sync, of course)?

Can I use rsync for this purpose? Should I look for another program? Hopefully not, though...

Edit: Since this solution shouldn't be only for me, I would appreciate it if the solution were somewhat automatic.

Edit2: Maybe there is a solution involving a repository of some kind: Subversion, Git, Mercurial, or something else.


Comments (4)

深蓝 2024-07-23 06:53:45


rsync is good for keeping one location in sync with a master, or in other terms, mirroring A to B. That's not what you're doing, though. You'd have to rsync A to B and B to A, which brings a whole new set of problems. If a file disappeared, do you need to delete it on the other side or rsync it back? Maybe it was modified on the other side; you can't check.

Anyway; the solution to this problem comes in the form of unison. That's a tool (works on Linux, OS X, Windows, BSD, ...) (has CLI tools, GUI tools, and can be scheduled nicely in cron) which will keep your home directory or any other directory nicely in sync, and is made to be able to deal with almost any type of conflict or problem. Those people thought it all out way better than we could here.
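For illustration, a unison profile for this scenario might look like the following (a sketch; server.domain and the paths are placeholders, and the full set of preferences is documented in the unison manual). Saved as ~/.unison/home.prf, it would be invoked as `unison home`:

```
# ~/.unison/home.prf -- hypothetical example profile

# the two replicas to keep in sync
root = /home/ME
root = ssh://server.domain//home/ME

# run non-interactively, accepting non-conflicting changes in both directions
auto = true
batch = true

# skip things that shouldn't be synced
ignore = Name .cache
ignore = Name *.tmp
```

With `batch = true` this is suitable for a cron job; genuine conflicts are skipped rather than resolved silently.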

Alternatively, there are SCMs. Many people use SCMs to manage their home directories. Subversion is popular for this, but I wouldn't recommend it at all: it consumes massive amounts of space, makes everything horribly slow, and forces you to depend on an active connection to the master repository to stay in sync. There are alternatives, like Git and others, but they all have their downsides.

Either way, any SCM-based solution violates one very big rule of SCMs: You should never keep big binary data in there. SCMs are not made for this. You don't keep your photo collections, movies, documents, downloads, and stuff like that in an SCM, even though you may want to keep them in sync or keep a history on them (especially so for pictures/documents).

It's important to understand that there is a difference between keeping backups and keeping things in sync. Your backups should be kept in a remote/detached location and can contain a history of everything you own. I personally recommend rdiff-backup for this. It keeps a history of everything beautifully, uses the rsync algorithm under the hood to minimize traffic, and the backup location always looks like the most current state of the backup: you can just browse through it like you do normal files.
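As a sketch of how that could be automated (the schedule, backup path, and retention period are arbitrary choices, and rdiff-backup's command-line syntax varies between versions; this uses the classic `host::path` form):

```
# crontab fragment (hypothetical): nightly backup, pruning increments older than 8 weeks
30 2 * * *  rdiff-backup /home/ME server.domain::backups/ME
45 3 * * *  rdiff-backup --remove-older-than 8W server.domain::backups/ME
```

Restoring a file as it was three days ago is then a matter of `rdiff-backup -r 3D` against the backup path.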

To summarize, I recommend you combine unison and rdiff-backup for an all-round solution to keeping your data safe and reliably in sync.

茶色山野 2024-07-23 06:53:45


Why not do this using Subversion? The linked article details how the author synchronises and stores history using source control (you don't have to use Subversion, obviously - there are alternatives).
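As a sketch of one such alternative, here is the bare minimum of tracking dotfiles with Git (all names, e-mail addresses, and paths are hypothetical; a real setup would add a remote, e.g. a bare repository on the server, to push to and pull from):

```shell
#!/bin/bash
# Sketch: version-control selected dotfiles with Git.
set -e
home=$(mktemp -d)              # stand-in for $HOME in this demo
cd "$home"
git init -q
git config user.email "me@example.com"
git config user.name "ME"

echo 'export EDITOR=vim' > .bashrc
git add .bashrc
git commit -qm "track .bashrc"

# each client would clone this repository and sync via git pull / git push
git log --format=%s
```

This gives you per-file history for free, but note the caveat in the accepted answer: version control is a poor fit for large binaries.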

囚你心 2024-07-23 06:53:45


Looking at what you've done, this should work ... you just need to ensure that each client gets synced to the server after you're finished working on it. I use the following, which I invoke manually on a per-directory basis:

function syncDown() {
    f=${1/\\/$/}/;
    rsync -acuvz --exclude 'CVS' --exclude '*.class' --exclude '.classpath' server:projects/$f $f;
}

function syncUp() {
    f=${1/\\/$/}/;
    rsync -acuvz --exclude 'CVS' --exclude '*.class' $f server:projects/$f;
}

If you're looking for unattended, automated synchronization, then you're not going to get it: you'll always have race conditions where you work on one client but that work gets overwritten by a sync from another.
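One mitigation (it reduces, but does not eliminate, the race) is to make sure two sync runs never overlap on the same machine by serializing them with a lock. A sketch using flock(1), with an arbitrary lock-file path and a placeholder where the real sync command would go:

```shell
#!/bin/bash
# Sketch: serialize sync runs on one machine with flock(1).
lock=/tmp/homesync.lock

exec 9>"$lock"                      # keep a file descriptor open on the lock file
if flock -n 9; then
    echo "lock acquired, syncing"   # the real rsync/unison call would go here
else
    echo "another sync is running, skipping"
fi

# while fd 9 holds the lock, any concurrent attempt is refused:
flock -n "$lock" -c 'echo unexpected' || echo "second run blocked"
```

This only protects against overlapping runs on one host; it does nothing about two different clients overwriting each other's work, which is the race described above.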

清风挽心 2024-07-23 06:53:45


It looks like you probably already know this, but, just to emphasize the point for those who may see this question in the future:

rsync only does one-way synchronization. If you want bi-directional sync, you need to use something else. (cvs/svn/git/etc. would be appropriate "something else"s, but a revision control system may not be the optimal choice if you don't need an update history.)

In practical terms, this means if you're rsyncing from A to B, then each sync will make the directory on B look exactly like the directory on A - any changes made on B since the last sync will be lost (barring excludes and with the caveat that rsync will only delete files if --delete is specified). This sort of arrangement with an authoritative master version which is then pushed out to other locations is appropriate in many cases, but any sort of collaborative work is not among them.
