Rails database backup script
I currently use the script below to back up a website, but it could be improved dramatically! Could you suggest any improvements, or perhaps alternative solutions?

Currently, I only delete backups after a massive number has accumulated, and this is not good. Does anyone know how I can delete items that are a month old, or start deleting once there are fifty backups, removing the oldest items first?
require 'fileutils'

namespace :db do
  desc "Backup the database to a file. Options: DIR=base_dir RAILS_ENV=development MAX=20"
  task :backup => [:environment] do
    # Year-first format, so a plain string sort is also a chronological sort
    # (the original "%d-%m-%Y_..." stamp breaks the sort-based pruning below).
    datestamp = Time.now.strftime("%Y-%m-%d_%H-%M-%S")
    base_path = ENV["DIR"] || "db"
    backup_base = File.join(base_path, 'backup')
    backup_folder = File.join(backup_base, datestamp)
    backup_file = File.join(backup_folder, "#{RAILS_ENV}_dump.sql.gz")
    FileUtils.mkdir_p(backup_folder)  # 'ftools'/File.makedirs is obsolete
    db_config = ActiveRecord::Base.configurations[RAILS_ENV]
    password_option = db_config['password'] ? "-p#{db_config['password']}" : ""
    sh "mysqldump -u #{db_config['username']} #{password_option} --opt #{db_config['database']} | gzip -c > #{backup_file}"
    puts "Created backup: #{backup_file}"

    all_backups = (Dir.entries(backup_base) - ['.', '..']).sort.reverse
    max_backups = (ENV["MAX"] || 10_000_000).to_i
    unwanted_backups = all_backups[max_backups..-1] || []
    unwanted_backups.each do |unwanted_backup|
      FileUtils.rm_rf(File.join(backup_base, unwanted_backup))
      puts "Deleted #{unwanted_backup}"
    end
    puts "Deleted #{unwanted_backups.length} backups, #{all_backups.length - unwanted_backups.length} backups available"
  end
end
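For the age-based cleanup asked about above, one simple option (a sketch, not part of the original script) is to compare each backup folder's modification time against a cutoff, rather than counting folders:

```ruby
require 'fileutils'

# Delete backup folders under backup_base whose modification time is
# older than max_age_days days, returning the deleted folder names.
# Sketch only: it assumes every directory under backup_base is a
# backup folder created by a task like the one above.
def prune_old_backups(backup_base, max_age_days = 30)
  cutoff = Time.now - max_age_days * 24 * 60 * 60
  deleted = []
  (Dir.entries(backup_base) - ['.', '..']).each do |entry|
    path = File.join(backup_base, entry)
    next unless File.directory?(path)
    if File.mtime(path) < cutoff
      FileUtils.rm_rf(path)
      deleted << entry
    end
  end
  deleted
end
```

Calling `prune_old_backups(backup_base)` at the end of the task would replace the MAX-based pruning; note that mtime reflects when the folder was written, so restored or copied backups may carry a newer timestamp than their name suggests.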
3 Answers
We use this script, which isn't quite as complex as yours but does more or less the same thing:

We then run that with cron on a regular basis (4x/day, but we effectively only keep the most recent run from each day, because later runs on the same day overwrite earlier ones). It keeps two days' worth of backups; we have a remote server which uses scp to copy the entire /path/to/backups/dbs/ directory twice daily, and that server keeps backups until we have time to burn them to DVD-ROM. Notice that if it misses a deletion, the file will hang around for quite a while -- the script only deletes "yesterday's" file, not "all files older than X" as your script does. But you can probably take some ideas from this and incorporate them into your script.
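The answer's script itself was not captured on this page. A minimal Ruby sketch of the rotation it describes (one date-stamped file per day, so repeated runs overwrite, with the file from two days back removed) might look like this; the base directory, file naming, and `mysqldump` invocation are all assumptions:

```ruby
require 'fileutils'
require 'date'

# One dump file per calendar day: runs later in the day overwrite the
# earlier dump, and the file from `days_kept` days ago is removed.
# Returns [current_path, stale_path].
def rotation_paths(base_dir, today = Date.today, days_kept = 2)
  name = ->(date) { File.join(base_dir, "dump-#{date.strftime('%Y-%m-%d')}.sql.gz") }
  [name.call(today), name.call(today - days_kept)]
end

# Intended cron entry point (needs mysqldump on PATH; not exercised here).
def run_backup(base_dir, db, user)
  FileUtils.mkdir_p(base_dir)
  current, stale = rotation_paths(base_dir)
  system("mysqldump -u #{user} --opt #{db} | gzip -c > #{current}") or raise "dump failed"
  FileUtils.rm_f(stale)
end
```

Because the deletion only ever targets one specific stale path, a missed cron run leaves that day's file behind, exactly the caveat the answer mentions.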
Why not use git with a cron job?

git setup:

cron job:

No deleting files -- you get history for ALL MySQL dumps, depending on when the cron job runs...
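The actual git setup and crontab lines from this answer were not captured on this page. Assuming the idea is to dump the database into a git working copy and commit on every cron run, a hedged sketch of the commands involved (repository path, database, and user names are all illustrative) could be:

```ruby
# Build the shell commands a cron-driven, git-based backup would run:
# dump the database into a working copy, then record it as a commit so
# the repository accumulates the history of every dump. All names here
# (repo path, database, user) are illustrative assumptions.
def git_backup_commands(repo_dir, db, user, timestamp)
  dump_file = File.join(repo_dir, "#{db}.sql")
  [
    "mysqldump -u #{user} --opt #{db} > #{dump_file}",
    "cd #{repo_dir} && git add #{db}.sql && git commit -m 'db backup #{timestamp}'"
  ]
end
```

One design note: git only stores a new blob when the dump actually changes, so identical consecutive dumps cost almost nothing, but large binary-ish SQL files can still make the repository grow quickly over time.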
Since you already put a timestamp in your backup folder name, why don't you parse the folder name and delete whatever has a timestamp older than 30 days?
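Building on this suggestion, a sketch of name-based pruning might look like the following. The strftime pattern is passed in so it can match whatever format the backup task actually uses for folder names; folders that don't parse are left alone:

```ruby
require 'date'
require 'fileutils'

# Delete backup folders whose name encodes a timestamp more than
# max_age_days days old. `format` must match the strftime pattern the
# backup task uses for folder names (e.g. "%d-%m-%Y_%H-%M-%S" in the
# question's script); entries that fail to parse are skipped.
def prune_by_folder_name(backup_base, format, max_age_days = 30)
  cutoff = DateTime.now - max_age_days
  deleted = []
  (Dir.entries(backup_base) - ['.', '..']).each do |entry|
    begin
      stamp = DateTime.strptime(entry, format)
    rescue ArgumentError
      next # not a timestamped backup folder
    end
    if stamp < cutoff
      FileUtils.rm_rf(File.join(backup_base, entry))
      deleted << entry
    end
  end
  deleted
end
```

Unlike mtime-based pruning, this survives folders being copied or restored (which resets filesystem timestamps), since the age is read from the name itself.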