Script to remove the history of all binary files in Git, without removing the files themselves
Say I have a huge git repository and it has a number of swfs and images in there. I want them to be included in the hosted github repository, but they don't need to be versioned, and I don't want to have to store them somewhere else.
What is the simplest way I can remove their history every time I commit to a repository? ...Such that, in the end, I have all the swfs and images, but no history for them.
Edit: The swf files are likely to change often so we can count on there being different versions for each commit.
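One way to approximate "files but no history" (a sketch under assumed conditions, not something the answers below propose) is to periodically collapse the whole repository into a single commit using an orphan branch. The scratch directory and asset.swf file name are placeholders:

```shell
# Sketch: collapse all history into one commit via an orphan branch.
# Runs in a throwaway repo; asset.swf stands in for a real binary.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
echo v1 > asset.swf && git add asset.swf && git commit -qm "v1"
echo v2 > asset.swf && git add asset.swf && git commit -qm "v2"
old=$(git symbolic-ref --short HEAD)   # remember the current branch name
git checkout -q --orphan squashed      # same tree and index, no ancestry
git commit -qm "assets without history"
git branch -D "$old" >/dev/null        # drop the old timeline
git branch -m "$old"                   # reuse the original branch name
git rev-list --count HEAD              # now a single commit
```

Note that this rewrites history: a hosted copy would need a force push, and anyone with a clone would have to re-clone, which is exactly the SHA-1 integrity problem the second answer below warns about.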
2 Answers
Not a direct answer, but I am not sure there is a problem here:
If your swfs and image files do not change, they will keep the same SHA-1 commit after commit: every commit references the same blob, so the files occupy disk space only once.
According to the Pro Git book:
And if your "resources" files (swf and images) do evolve over time, recording their history is actually useful: it lets you go back and check out a coherent configuration (i.e. the source plus the resources that were valid at that time).
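The claim about unchanged files is easy to verify: an unchanged file resolves to the same blob object in every commit that contains it. A small sketch in a scratch repo (file names are placeholders):

```shell
# Same content -> same blob object, so the file is stored once no matter
# how many commits reference it. Scratch repo for illustration only.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
printf 'pretend-binary' > logo.swf
git add logo.swf && git commit -qm "add asset"
echo 'v1' > main.c && git add main.c && git commit -qm "touch the code only"
git rev-parse HEAD:logo.swf     # blob ID recorded by the newest commit
git rev-parse HEAD~1:logo.swf   # identical blob ID in the older commit
```

Both `rev-parse` calls print the same object ID, confirming that the second commit added no new copy of logo.swf.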
It won't be possible to remove the "history" without destroying the SHA-1 integrity of the repository's entire timeline. This is one of Git's strongest features: each commit ID is a hash built over its complete history.
But as far as I know, binary objects are delta-compressed inside pack files, so they shouldn't waste too much storage, if that is your concern.
Otherwise, if you just want to get the big binary data out of the way and host it elsewhere, try working with submodules.
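A minimal sketch of the submodule route, using local scratch repos in place of real hosting URLs (all paths and names below are placeholders):

```shell
# Sketch: keep big binaries in a separate "assets" repo mounted as a
# submodule of the main repo. Local paths stand in for remote URLs.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/assets"
( cd "$tmp/assets" &&
  git config user.email demo@example.com && git config user.name demo &&
  echo swf-bytes > big.swf && git add big.swf && git commit -qm "assets" )
git init -q "$tmp/main"
cd "$tmp/main"
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "init"
# Recent Git disables the file:// transport for submodules by default:
git -c protocol.file.allow=always submodule add "$tmp/assets" assets
git commit -qm "track big binaries as a submodule"
git submodule status            # shows the pinned commit of the assets repo
```

The main repo then records only a pointer (a gitlink) to one commit of the assets repo, so the binaries' churn stays out of the main history; collaborators fetch them with `git clone --recurse-submodules`.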