Is there an upper limit to the number of commits a git repository can handle?

Posted 2024-12-03 13:51:10


I'm wondering if there's an upper limit to the number of commits that a git repository can handle.

In a solo project I'm working on right now, I've been coding locally, committing/pushing changes in git, then pulling the changes on my development server.

I treat this as an easier alternative to working locally and uploading changes via FTP... Fortunately/Unfortunately it's such an easy workflow that I sometimes go through many edit/commit/push/pull/browser-refresh cycles while coding.

I'm wondering if this is going to turn around and bite me somewhere down the line. If it's likely to be a problem, I'm wondering how I can avoid that trouble ... It seems like a rebase might be the way to go, especially since I won't have to worry about conflicting branches etc.
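For reference, each cycle in that workflow boils down to something like the following; the remote name origin and the branch name main are just placeholders for whatever the project actually uses:

    # local machine: stage, commit, and publish the change
    git add -A
    git commit -m "describe the change"
    git push origin main

    # development server: fetch and apply the change, then refresh the browser
    git pull origin main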


3 Answers

我只土不豪 2024-12-10 13:51:10


Well the "upper limit" would likely be the point at which a SHA1 collision occurs, but since the SHAs are 40 hexadecimal digits long (16^40 ~ 1.4x10^48 possibilities), it's so close to zero possibility that it's not even funny. So there's roughly a zero percent chance you'll have any problems for at least the next several millennia.

Hyperbolic example (just for fun): 1 commit/minute, changing just one file -> 3 new SHAs used (blob, tree, commit) = 3 new SHAs used/minute = ... = ~1.6M SHAs used/year = ~1.6 billion SHAs/millennium = about 1x10^-37 % of the space used each millennium... (at 1000 files/commit/minute, it's still only 3.6x10^-35 %)

That being said, if you want to clean up your history, squashing those commits down with an interactive rebase is probably your best bet. Just make sure you understand the implications if you've shared the repo publicly at all.
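A minimal sketch of that squash, assuming the last 20 commits (an arbitrary count) should be collapsed into one:

    # interactively rebase the last 20 commits (pick a count that fits your history)
    git rebase -i HEAD~20
    # in the editor, leave the first commit as "pick" and change the rest to "squash" or "fixup"

    # if the branch was already pushed, the rewritten history must be force-pushed
    git push --force-with-lease origin main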

You might also want to garbage collect after rebasing to free up some space (make sure the rebase worked right first, though, and you may need to tell it to collect everything, because by default it won't collect anything newer than two weeks old).
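Roughly, that cleanup looks like this; the --expire=now/--prune=now options are only needed if you don't want to wait out the default two-week grace period, and the reflog expiry matters because reflog entries keep the pre-rebase commits reachable:

    # let go of the pre-rebase history immediately instead of after two weeks
    git reflog expire --expire=now --all
    git gc --prune=now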

无敌元气妹 2024-12-10 13:51:10


I'm pretty sure you don't have to worry at all :)

Git uses SHA-1 hashes to identify files and commits, and the probability of a hash collision is near zero. So have fun!

I've personally made around 30 commits a day without any issues.

But avoid versioning binary files :) they make the repository really heavy for what they're worth.
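One way to follow that advice is to ignore typical binary and build artifacts up front; the patterns below are only examples and should be adapted to the project:

    # append some illustrative binary/build patterns to .gitignore
    cat >> .gitignore <<'EOF'
    *.zip
    *.exe
    *.dll
    *.o
    build/
    EOF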

究竟谁懂我的在乎 2024-12-10 13:51:10


I don't think there is any hard limit to the number of commits git can handle, only a limit to what you can personally digest. With larger projects and multiple developers, you'll see far more activity than you would ever generate on your own.

You can keep a secondary branch that you merge into every week if you wish, but git will never care how many commits you have. Go crazy, as long as you can still understand what you're doing. You can always diff against a commit several steps back, or use tools like git bisect to track down problems in your history.
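For example, looking back a few commits or bisecting a regression works roughly like this; the commit distances and the good/bad markers are placeholders:

    # compare the working tree with the state five commits ago
    git diff HEAD~5

    # binary-search history for the commit that introduced a bug
    git bisect start
    git bisect bad              # the current commit is broken
    git bisect good HEAD~50     # some older commit known to be good
    # test each commit git checks out, marking it "git bisect good" or "git bisect bad"
    git bisect reset            # return to where you started when finished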
