Automatically detecting file changes and syncing them to S3
I have a local directory of media files on a Linux system, which I synchronise with an Amazon S3 account using an s3sync script. Currently, I'm manually running the s3sync script when I know the media files have been modified.
How would I automatically run the script when files are modified?
I was thinking of creating a cron job to run the script every few minutes, but that seems like an excessive amount of processing, because even if there are no changes, the script still has to scan the entire directory structure, which is quite large.
I also considered incron/inotify, which allow running commands when a specific file or directory changes, but these tools don't seem to automatically support monitoring changes across an entire nested directory tree. Correct me if I'm wrong, but it seems that incron/inotify can only monitor files they've been explicitly told to monitor. For example, if I wanted to monitor changes to all files at any level inside a directory, I'd have to write separate scripts to monitor directory and file additions/deletions, just to keep the list of files and directories monitored by incron up to date.
Are there more efficient solutions?
For this kind of task, I use the fssm gem.

Create a file, watcher.rb, containing an FSSM.monitor block, then run it with `ruby watcher.rb`. Of course, you can daemonize it if you want.
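A minimal watcher.rb sketch, assuming the fssm gem is installed; the directory path and the s3sync invocation are placeholders you would replace with your own:

```ruby
require 'fssm'

# Hypothetical sync command; substitute your actual s3sync call.
SYNC = 's3sync.rb -r /path/to/media mybucket:media'

# Watch the media directory, including all nested subdirectories,
# thanks to the '**/*' glob. FSSM blocks here and fires the
# callbacks as filesystem events arrive.
FSSM.monitor('/path/to/media', '**/*') do
  update { |base, relative| system(SYNC) }
  create { |base, relative| system(SYNC) }
  delete { |base, relative| system(SYNC) }
end
```

The `'**/*'` pattern is what sidesteps the recursive-watch problem described in the question: fssm tracks the whole tree for you.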
Here is a sample scenario you might use instead, utilizing a simple rsync script:

http://andrewwilkinson.wordpress.com/2011/01/14/rsync-backups-to-amazon-s3/

Basically, it means using FUSE and s3fs (http://code.google.com/p/s3fs/) to mount the S3 bucket as a directory on your local filesystem and using rsync to sync the two. A simple cron job would do the trick.
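A sketch of that setup, assuming s3fs is installed and credentials are configured; the bucket name, mount point, and paths are placeholders:

```shell
# Mount the S3 bucket on the local filesystem via s3fs/FUSE.
s3fs mybucket /mnt/s3

# One-way sync of the local media directory into the mounted bucket.
rsync -av --delete /path/to/media/ /mnt/s3/media/

# Example crontab entry (crontab -e) to run the sync every 15 minutes:
# */15 * * * * rsync -av --delete /path/to/media/ /mnt/s3/media/
```

Because rsync compares file timestamps and sizes before transferring, the cron job stays cheap when nothing has changed, which addresses the "excessive processing" concern from the question.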
Now there is an efficient solution. This was just announced (long overdue):
http://aws.amazon.com/blogs/aws/s3-event-notification/
It is very simple to implement. Time to throw out all the ugly cron jobs and list-loops.
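As a sketch of wiring up such a notification with the AWS CLI; the bucket name and SNS topic ARN are placeholders:

```shell
# Configure the bucket so every object creation publishes a message
# to an SNS topic (bucket name and topic ARN are placeholders).
aws s3api put-bucket-notification-configuration \
  --bucket mybucket \
  --notification-configuration '{
    "TopicConfigurations": [{
      "TopicArn": "arn:aws:sns:us-east-1:123456789012:media-sync",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```

Note that S3 event notifications fire on changes *in the bucket*; for the original local-to-S3 direction you would still need a local watcher, but they remove the need to poll S3 itself.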