Why does the size of a binary executable increase by different amounts for the same block of code?

Published 2025-02-03 11:59:49

Let's say I have a C++ program and I compile it using g++. I then get an executable file which, as an example, has a size of 100 kb. I then add a couple of lines of C++ code, compile again, and the size of the executable has increased to 101 kb. Then I add the exact same block of C++ code and compile a third time. This time the executable has increased to 106 kb. Why does the same code sometimes increase the size of the executable by one amount, and another time by something much greater?

Also, the big increase only happens every couple of builds; most of the time the executable grows by the same, small, amount.

那些过往 2025-02-10 11:59:49

There are a variety of reasons why the size change of the resulting binary is not linear with the code size change. This is particularly true if some kind of optimization is enabled.

Even in debug mode (no optimizations), the following things could cause this to happen:

  • The code size in the binary typically needs to be aligned to a certain size (dependent on the hardware). The size can only grow in multiples of the alignment.
  • The same applies to metadata tables (relocation tables, debug information)
  • The compiler may reserve extra space for debug information based just on the number of methods/variables in use
  • With some compilers (not sure about gcc), code in a binary can be updated in-place when only minor changes were made, instead of performing a full link on each build. This would result in different binary sizes when adding code and rebuilding vs. deleting the binary before each build.

If optimizations are enabled, it gets even more confusing, due to possible optimization strategies:

  • The compiler may remove code it finds to be unreachable
  • If optimizing for speed, loop unrolling is a good thing to do, but only up to a certain degree. If you add more code inside the loop, the compiler might decide that the extra code size is no longer worth the speed gain.
  • Other optimizations, too, work only up to a certain level, after which they do more harm than good. This can even result in the binary getting smaller when code is added.

These are just a bunch of possible reasons, there might be many more.
