Is it possible to clone in steps with hgsubversion?
I'm trying to clone a rather large Subversion repository with hgsubversion:

hg clone --startrev 8890 svn+https://my.reposit.ory/trunk trunk_hg

After about an hour, the clone operation aborts with an out-of-memory message:

[r20097] user: description
abort: out of memory

Is it possible to specify an end revision for the clone operation and get the remaining revisions with a pull? Or somehow break up the clone in smaller steps?
Answers (4)
You can specify a stop revision with -r for clone, as others have suggested. Another option (if you kept the clone where things crashed) would be to just run hg pull in the trunk_hg copy. You might have to edit/create .hg/hgrc yourself to add

[paths]
default = svn+https://my.reposit.ory/trunk

since I think we add that at the end of the cloning process. Maybe run hg svn rebuildmeta before your pull, just for good measure, in case the tracking metadata for hgsubversion got hosed when the OOM happened. I hope this helps!
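A minimal sketch of those recovery steps, assuming the crashed clone lives in trunk_hg and using the example repository URL from the question (adjust both to your setup):

```shell
cd trunk_hg

# Recreate the default path that hgsubversion normally writes
# into .hg/hgrc at the end of a successful clone.
cat >> .hg/hgrc <<'EOF'
[paths]
default = svn+https://my.reposit.ory/trunk
EOF

# Rebuild hgsubversion's tracking metadata in case the OOM left it corrupted,
# then continue fetching from where the clone died.
hg svn rebuildmeta
hg pull
```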
http://www.selenic.com/mercurial/hg.1.html#clone
Could try using the
-r <revid>
flag to clone only a particular changeset. Though that may or may not work with hgsvn.
Cloning with a limited range of revisions and then pulling is the recommended method and I can confirm that it works flawlessly for svn repositories in the several GB size range.
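A sketch of that clone-then-pull approach, again with the example URL from the question; the stop revisions shown (10000, 12000, and so on) are arbitrary illustrative step sizes, not values from the original post:

```shell
# Clone only a limited revision range first to keep memory use down.
hg clone --startrev 8890 -r 10000 svn+https://my.reposit.ory/trunk trunk_hg
cd trunk_hg

# Pull the remaining history in smaller chunks.
hg pull -r 12000
hg pull -r 14000

# Finally, pull everything that is left.
hg pull
```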
Here is a workaround to clone the whole svn repo: