What is the easiest way to create a multipart archive on Unix?
tar|gzip
is wonderful, except that the files can get too big, and transferring them over the network gets complicated. DOS-era archivers were routinely used to create multipart archives, one per floppy, but gzip doesn't seem to have such an option (because of the Unix streaming philosophy).
So what's the easiest and most robust way of doing this under Linux (and obviously with archive size ~2GB, not 1.44MB)?
4 Answers
You could split it up into pieces by using /usr/bin/split (with the "-b" option) - read 'man split'.
I don't bother using gzip for archiving any more, just for unpacking archives from other people who haven't yet been converted :-)
7zip has insane levels of compression (although I haven't put it head-to-head in every scenario), and it also supports creating volumes, which answers your specific question.
For example, the following command compresses the current directory tree into 1G volumes called /backups/2021_09_28.7z.NNN, where NNN ranges from 001 to whatever value it needs:
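A sketch of such a command, assuming the p7zip command-line tool is installed as 7z (the -v1g switch sets the 1G volume size; 7z appends the .001, .002, ... suffixes itself):

    # add (a) the current directory tree to a 7z archive split into 1 GB volumes
    7z a -v1g /backups/2021_09_28.7z .

To extract, point 7z at the first volume and it picks up the rest automatically: 7z x /backups/2021_09_28.7z.001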
The typical Unix solution would be "split -b", but this option is not very robust. If any of the pieces is damaged or lost, you lose everything from there on.
You could use split in conjunction with bzip2, which is often able to repair a broken archive to a certain degree.
A much safer way would be to use parchive (PAR2 more specifically). It will create additional RAID-style files to recover from damage to sections of the files. For more info, look at quickpar, parchive.sf.net, or the par2 package in Linux distributions...
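As a hedged sketch of that combination (assuming GNU split and the par2cmdline tool; the backup.tar.bz2 name, 2G piece size, and 10% redundancy are just example values):

    # split an existing bzip2-compressed archive into 2 GB pieces
    split -b 2G backup.tar.bz2 backup.tar.bz2.part-

    # create PAR2 recovery files covering all the pieces, with ~10% redundancy (-r10)
    par2 create -r10 backup.par2 backup.tar.bz2.part-*

    # after transfer: verify the pieces and repair any damaged or missing ones
    par2 repair backup.par2

    # reassemble and unpack
    cat backup.tar.bz2.part-* | tar xjf -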
You can use tar and generate a multipart archive. The trick is that you have to specify the names of as many archive files as you expect to generate, each with a separate -f flag. Fortunately, there is an easy way to do this. Say you expect to get no more than 10 archive files. Then you can use this command:
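A sketch of what that looks like, assuming GNU tar, placeholder volume names part1.tar through part10.tar, and a volume size set with -L (in units of 1024 bytes, so 2097152 is roughly 2 GiB); --file= is the long form of -f, used here so that shell brace expansion attaches the flag to every name:

    # -c create, -M multi-volume; part{1..10}.tar expands to ten separate --file options
    tar -c -M -L 2097152 --file=part{1..10}.tar /path/to/data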
This will expand to 10 -f flags, each with a numbered archive file name. To extract, use the matching command with the same list of volumes:
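Again as a sketch, with the same placeholder volume names:

    # extract (-x), reading the volumes back in order
    tar -x -M --file=part{1..10}.tar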
The only catch is that you can't compress (-z) the files when using tar's multi-volume archiving.