Speeding up compilation by distributing object files
I've got a project with multiple branches, each of which takes about an hour to compile on top-of-the-line machines. I need to recompile across branches several times a week, across several machines, and many other developers in the office are doing the same.
Is it possible to pick a nightly revision, compress the object files, and simply have developers sync to that revision and extract the object files? Would incremental builds still work? It only takes about 3 minutes to download and extract the object files rather than an hour to compile, so it would be a huge improvement.
If it is possible, what must be considered? I assume the toolchain has to be identical across machines (gcc build, OS X version, and of course the instruction set).
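A minimal sketch of what the pack/restore step might look like, assuming a make-based build whose objects land under a `build/` directory; the script, paths, and the toolchain fingerprint scheme are all illustrative, not from the original post. One point worth noting in code: make decides what to rebuild by comparing timestamps, so the extracted objects have to end up newer than the freshly synced sources, or the cache buys you nothing.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: package a nightly object-file cache on the
build machine and restore it on a developer machine, keyed by
revision and a toolchain fingerprint. Paths and names are made up."""
import platform
import subprocess
import tarfile
from pathlib import Path

BUILD_DIR = Path("build")  # illustrative: wherever the *.o files land
CACHE_NAME = "objcache-{rev}-{fp}.tar.gz"


def toolchain_fingerprint() -> str:
    """gcc version + OS + architecture; objects built under a different
    fingerprint must never be mixed into the same build."""
    gcc = subprocess.run(["gcc", "-dumpversion"],
                         capture_output=True, text=True, check=True)
    return f"gcc{gcc.stdout.strip()}-{platform.system()}-{platform.machine()}"


def pack(revision: str) -> Path:
    """Run on the build machine after the nightly build succeeds."""
    out = Path(CACHE_NAME.format(rev=revision, fp=toolchain_fingerprint()))
    with tarfile.open(out, "w:gz") as tar:
        for obj in sorted(BUILD_DIR.rglob("*.o")):
            tar.add(obj)
    return out


def restore(archive: Path) -> None:
    """Run on a developer machine *after* syncing to the same revision."""
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall()
    # make rebuilds anything older than its prerequisites, so bump the
    # extracted objects' mtimes past the checked-out sources.
    for obj in BUILD_DIR.rglob("*.o"):
        obj.touch()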
I've worked at companies that had build machines place sets of libraries for different branches (versions) on network shares. Development environments were set up to link to local objects and libs first (if they exist), and these network shares second.
Developers had the full source, but only needed to build incremental objects/libs. That meant a change only required rebuilding the libraries it touched, and it worked out fine as long as everyone stuck to a similar once-a-day update cycle. In the worst case, you were back to building the full source.
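A sketch of the local-first, share-second lookup this describes, with made-up directory names; in practice the ordering would simply be the order of the `-L` paths handed to the linker, but the same first-match logic can be written out explicitly:

```python
from pathlib import Path
from typing import Optional

# Illustrative paths only; real locations depend on the site's setup.
LOCAL_LIB_DIR = Path("build/lib")
NETWORK_SHARE = Path("/mnt/buildserver/release-1.2/lib")


def resolve_lib(name: str) -> Optional[Path]:
    """Prefer a locally built library; fall back to the nightly copy on
    the network share. This mirrors the order a linker searches -L paths."""
    for base in (LOCAL_LIB_DIR, NETWORK_SHARE):
        candidate = base / name
        if candidate.exists():
            return candidate
    return None


# A developer who rebuilt only libfoo links their local libfoo.a and
# takes every other library from the share:
print(resolve_lib("libfoo.a"))
```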