Recommended strategy for backing up the App Engine datastore
Right now I use remote_api and appcfg.py download_data to take a snapshot of my database every night. It takes a long time (6 hours) and is expensive. Without rolling my own change-based backup (I'd be too scared to do something like that), what's the best option for making sure my data is safe from failure?
PS: I recognize that Google's data is probably way safer than mine. But what if one day I accidentally write a program that deletes it all?
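For reference, a nightly snapshot with the bulkloader looks something like this; the app id, remote_api URL, and output filename are placeholders, not taken from the post:

```
appcfg.py download_data \
  --application=your-app-id \
  --url=http://your-app-id.appspot.com/_ah/remote_api \
  --filename=datastore-backup.dat
```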
Comments (1)
I think you've pretty much identified all of your choices. Full backups with download_data could run less frequently than once per night if the nightly run is prohibitively expensive. The last option, rolling your own change-based backup, is actually an interesting idea. You'd need a modification timestamp on all entities, and you wouldn't catch deleted entities, but otherwise it's very doable with remote_api and cursors.
Edit:
Here's a simple incremental downloader for use with remote_api. Again, the caveats are that it won't notice deleted entities, and it assumes all entities store the last modification time in a property named updated_at. Use it at your own peril.
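A minimal sketch of such a downloader, assuming the Python 2 SDK, an indexed `updated_at` DateTimeProperty on every entity, and the default remote_api handler at `/_ah/remote_api`; the app id, the kind name `MyModel`, and the pickle files are illustrative stand-ins:

```python
import datetime
import getpass
import pickle

from google.appengine.ext import db
from google.appengine.ext.remote_api import remote_api_stub

APP_ID = 'your-app-id'           # hypothetical app id
CHECKPOINT = 'last_sync.pickle'  # local file remembering the last sync time


def auth_func():
    # Python 2, as the remote_api SDK of that era requires.
    return raw_input('Email: '), getpass.getpass('Password: ')


class MyModel(db.Expando):
    """Stand-in for your real kind; Expando fetches entities without the full schema."""
    updated_at = db.DateTimeProperty()


def modified_since(last_sync, batch_size=100):
    """Yield entities whose updated_at is newer than last_sync, paging with cursors."""
    query = MyModel.all().filter('updated_at >', last_sync).order('updated_at')
    cursor = None
    while True:
        if cursor:
            query.with_cursor(cursor)
        batch = query.fetch(batch_size)
        if not batch:
            break
        for entity in batch:
            yield entity
        cursor = query.cursor()


def main():
    remote_api_stub.ConfigureRemoteApi(
        APP_ID, '/_ah/remote_api', auth_func,
        servername='%s.appspot.com' % APP_ID)

    try:
        with open(CHECKPOINT, 'rb') as f:
            last_sync = pickle.load(f)
    except IOError:
        last_sync = datetime.datetime.min  # first run: fetch everything

    for entity in modified_since(last_sync):
        # Persist each entity however you like; appending pickled property
        # dicts is the simplest thing that could work.
        with open('backup.pickles', 'ab') as out:
            pickle.dump(db.to_dict(entity), out)
        last_sync = max(last_sync, entity.updated_at)

    with open(CHECKPOINT, 'wb') as f:
        pickle.dump(last_sync, f)


if __name__ == '__main__':
    main()
```

Because results are ordered by updated_at and the high-water mark is checkpointed, an interrupted run can simply be restarted and will resume from the last completed sync point.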