Why is there an arithmetic discrepancy between Debug and Release modes, and how do I resolve it?

Posted 2024-11-29 16:24:47

In my program this fragment:

  trace.log(
   String.Format("a= {0:R} b= {1:R} a<b= {2}", 
    b.GetPixel(447, 517).GetBrightness(), (100F / 255F),
    b.GetPixel(447, 517).GetBrightness() < (100F / 255F))
  );

outputs this in Debug\prog.exe:

 a= 0.392156869 b= 0.392156869 a<b= False

but this different result in Release\prog.exe:

 a= 0.392156869 b= 0.392156869 a<b= True

Can anyone explain why the same operands give a different comparison result? And recommend a remedy, ideally a program-wide one such as a compiler switch? Thanks.

EDIT: Clarification: the above results are from launching Debug\prog.exe and Release\prog.exe in Windows Explorer.

EDIT: Further info: executing from VS, "Start Debugging" gives False (i.e. the accurate result, same as the Explorer-launched Debug\prog.exe), and "Start without debugging" gives True (i.e. the inaccurate result, same as the Explorer-launched Release\prog.exe).

EDIT: Alternative test cases with literals substituted

These two cases

  trace.log(
   String.Format("a= {0:R} b= {1:R} a<b= {2}",
    0.392156869F, 0.392156869F, 
    0.392156869F < 0.392156869F)
  );
  trace.log(
   String.Format("a= {0:R} b= {1:R} a<b= {2}",
    0.392156869F, (100F / 255F),
    0.392156869F < (100F / 255F))
  );

show no discrepancy in Debug and Release output. This case:

  trace.log(
   String.Format("a= {0:R} b= {1:R} a<b= {2}",
    b.GetPixel(447, 517).GetBrightness(), 0.392156869F,
    b.GetPixel(447, 517).GetBrightness() < 0.392156869F)
  );

shows the same discrepancy (and inaccuracy in Release) as the original test case.

EDIT: Belated minimal test case demoing the problem

Color c = Color.FromArgb( 255, 100, 100, 100 );
trace.log(
 String.Format("a= {0} b= {1} a<b= {2}",
 c.GetBrightness(), 0.392156869F,
 c.GetBrightness() < 0.392156869F)
);

outputs this correct result in Debug\prog.exe:

 a= 0.392156869 b= 0.392156869 a<b= False

but this incorrect result in Release\prog.exe:

 a= 0.392156869 b= 0.392156869 a<b= True

EDIT: Remedies

1) From Peter's answer below:

trace.log(
 String.Format("a= {0:R} b= {1:R} a<b= {2}",
  c.GetBrightness(), 0.392156869F, 
  c.GetBrightness().CompareTo(0.392156869F)<0)
);

2) From ChrisJJ, the questioner (UPDATED):

float comp = c.GetBrightness();
trace.log(
 String.Format("a= {0:R} b= {1:R} a<b= {2}", 
 comp, 0.392156869F,
 comp < 0.392156869F)
);

I think this adds to the evidence for a Release-mode compiler bug.
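For anyone trying to reproduce this outside the original program, here is a minimal self-contained sketch of the test case above (my own assembly of the pieces, not code from the thread). It assumes a console project referencing System.Drawing and replaces the trace.log helper with Console.WriteLine; build it for x86 in Debug and Release to compare the output.

  using System;
  using System.Drawing;   // Color.GetBrightness lives here
  
  class BrightnessCompare
  {
      static void Main()
      {
          // Same colour as the minimal test case: brightness = 100/255.
          Color c = Color.FromArgb(255, 100, 100, 100);
          const float threshold = 100F / 255F;
  
          // Direct comparison, as in the question. In a 32-bit Release build
          // the JIT may keep GetBrightness()'s result at x87 extended
          // precision, so this can evaluate to True even though both printed
          // values look identical.
          Console.WriteLine("a= {0:R} b= {1:R} a<b= {2}",
              c.GetBrightness(), threshold,
              c.GetBrightness() < threshold);
  
          // Remedy 2 from above: storing the result in a float local appears
          // to force it down to 32-bit precision here, although the spec does
          // not strictly guarantee that for locals.
          float comp = c.GetBrightness();
          Console.WriteLine("a= {0:R} b= {1:R} a<b= {2}",
              comp, threshold, comp < threshold);
      }
  }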

Comments (1)

青春如此纠结 2024-12-06 16:24:47

It's most likely the result of different floating point rules between debug mode and release mode (see here and here).

Update:

Thanks to your new update to the question, I was able to reproduce the problem on my machine (64-bit Windows). This appears to happen only when the platform is set to X86; the X64 and AnyCPU platforms show the correct result in Release mode. Probably, when the platform is X86, the common language runtime applies X86 emulation on the 64-bit machine and apparently messes up the comparison operator.
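(A side note, not part of the original answer: if it is unclear which case a given build falls into, a quick check such as the following, assuming .NET Framework 4 or later, shows whether the process is actually running as 32-bit.)

  // Diagnostic sketch, assuming .NET Framework 4+: an x86 build on 64-bit
  // Windows reports Is64BitProcess = False, Is64BitOperatingSystem = True.
  Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
  Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);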

However, I found a possible workaround: Use CompareTo instead of the "<" and ">" operators, like this:

c.GetBrightness().CompareTo(0.392156869F)<0

On my machine, this will provide the same correct results in X86 as in X64 and AnyCPU.
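Another workaround sometimes suggested for this class of problem (my addition, not something the answerer tested against this repro) is an explicit cast to float, since the C# specification notes that an explicit cast can be used to force a floating-point value to the exact precision of its type:

  // The redundant-looking (float) cast forces the extended-precision
  // intermediate to be rounded to a true 32-bit float before the comparison.
  (float)c.GetBrightness() < 0.392156869F

Like the local-variable remedy, this is a per-expression fix rather than the program-wide compiler switch asked for in the question.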
