I know that you can run rsync commands to send folders and files to another folder, but I am trying to work out the best way to sync my local files to my local GitHub folder. Because of where I develop and where my GitHub app lives, my folders are in two different locations.
-
Developer folder for example is
/user/(username)/nodejs/(app name)
-
GitHub folder is
/user/(username)/github/(git name)
Is there any way to have the two folders synced?
For example, if I make a change in my (app name) folder I want it to update in the GitHub folder.
If a pull request from GitHub updates the (git name) folder, I want that change sent to the (app name) folder.
I know that at a basic level you could do:
rsync -a source dest
However, that will copy everything.
Also, the other issue is: what if the pull request has a bug in it? I know that as a developer I can fix the bug and then push that fix to GitHub, but I don't want to run into a situation where I potentially screw up my (app name) folder to the point that I have to get the latest version from GitHub and re-code work I have already done.
I also know I could use something like this:
rsync -v -a --ignore-existing /(APP NAME)/ /(GIT NAME)/
Comments (2)
Please try this: a good remote sync that also deletes any files removed on the source side, i.e. a good mirror. For compression, add the -z flag; --delete-delay makes rsync wait until the files have been copied before deleting anything on the receiving side.
Dry runs/testing: I realize you didn't ask, but to help your case, may I recommend a dry run first [-a archive, -n dry run, -v verbose]. The -a flag makes the copy recursive and keeps original properties like modification times, while also copying any symlinks it encounters as symlinks, and preserving permissions, owner and group information, and device and special files.
Additional misc. for remote servers (by IP):
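The same mirror works over SSH; here the user, IP and paths are all placeholders, so the command is printed rather than executed:

```shell
# Placeholders only: substitute your SSH user, the server's IP, and real paths.
REMOTE=deploy@203.0.113.10
SRC=/user/username/nodejs/appname/
DEST=/user/username/github/gitname/

# -e ssh runs the transfer over SSH; everything else works as it does locally.
CMD="rsync -az --delete-delay -e ssh $SRC $REMOTE:$DEST"
echo "$CMD"
```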
Just wanted to throw this into the mix in case it helps... I had to do something very similar not so long ago. I didn't want to add the directory in question to the git repo because it was 4GB; however, in my situation the two servers were remote.
Rather than using
rsync...
which is a great approach, I went for AWS S3... it was really easy to sync to, and it just needed a bit of tinkering with IAM to get the permissions dialled in. Then, using the AWS CLI, I used the S3 sync feature, which easily cleans up any deleted files and the like.
To upload your files:
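Something along these lines (the bucket name and folder here are placeholders, and the command is guarded so it's a no-op without the AWS CLI installed and configured):

```shell
# aws s3 sync only transfers changed files; --delete removes objects
# that no longer exist locally. Bucket name and path are placeholders.
if command -v aws >/dev/null 2>&1; then
  aws s3 sync ./my-large-dir s3://my-example-bucket/my-large-dir --delete
fi
```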
Then to download on dev servers or production:
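The download side is just the mirror-image command (same placeholder bucket, same guard):

```shell
# Pull the folder back down from the bucket; --delete removes local files
# that were deleted from the bucket. Bucket name and path are placeholders.
if command -v aws >/dev/null 2>&1; then
  aws s3 sync s3://my-example-bucket/my-large-dir ./my-large-dir --delete
fi
```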
... I was then able to link the S3 bucket to my CDN (CloudFront) to serve these larger files.
I hope this helps.