Trade-offs of using BigDecimal instead of double in Java
I'm writing code for a neural network, and I'm wondering whether I should use BigDecimal instead of double. I'm actually somewhat worried that even using double might not yield good results, and that I might have to migrate my code to a more efficient language, like C++. I've read in a question here that BigDecimal is 1000 times slower than double? That's a lot.
On the other hand, I'm going to be working a lot with decimal numbers, and having them be more precise would always be good. I can't really tell whether precision could cause problems for it, either. I don't think any of the implementations I've seen around use it either, so I'm probably not going to. Although sometimes the network doesn't behave as it should; whether that's a precision error or a problem with its logic, I'm not sure.
But I'm wondering, do you guys only use BigDecimal when dealing with money? Any thoughts on this?
6 Answers
Using Java's double data type for weights in a neural network seems very appropriate. It is a good choice for engineering and scientific applications.
Neural networks are inherently approximate. The precision of BigDecimal would be meaningless in this application, performance impact aside. Reserve BigDecimal primarily for financial applications.
I use integers/longs when dealing with money, because using any sort of decimal representation is absurd. You should DEFINITELY not use doubles, and there are some money handling libraries out there you may want to look at.
As I recall, however, the money libraries are immature or underdeveloped.
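As a minimal sketch of that integer/long approach, assuming amounts are tracked in the currency's smallest unit (the names here, like priceInCents, are my own, not from any particular library):

```java
public class LongMoneyExample {
    public static void main(String[] args) {
        // Hold money as a long count of the smallest unit (cents), never as a double.
        long priceInCents = 19_99; // $19.99
        long taxInCents = 1_65;    // $1.65

        long totalInCents = priceInCents + taxInCents; // exact integer arithmetic

        // Convert to a decimal form only at the display boundary.
        System.out.printf("Total: $%d.%02d%n", totalInCents / 100, totalInCents % 100);
    }
}
```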
Integers for whole and fractional values, combined with Currency, are the way to go. Either find a library or write your own.
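One way to read this suggestion is to pair integer minor units with java.util.Currency, so the scale comes from the currency itself. A sketch along those lines, where the Amount record is my own illustration rather than an existing library type:

```java
import java.math.BigDecimal;
import java.util.Currency;

// Illustrative: an amount held as an integer count of the currency's minor units.
record Amount(long minorUnits, Currency currency) {

    Amount plus(Amount other) {
        if (!currency.equals(other.currency)) {
            throw new IllegalArgumentException("currency mismatch");
        }
        // addExact throws on overflow instead of silently wrapping.
        return new Amount(Math.addExact(minorUnits, other.minorUnits), currency);
    }

    @Override
    public String toString() {
        // Let the currency decide the scale: 2 for USD, 0 for JPY, etc.
        int digits = currency.getDefaultFractionDigits();
        return currency.getCurrencyCode() + " " + BigDecimal.valueOf(minorUnits, digits);
    }
}

public class CurrencyExample {
    public static void main(String[] args) {
        Currency usd = Currency.getInstance("USD");
        Amount subtotal = new Amount(19_99, usd);
        Amount tax = new Amount(1_65, usd);
        System.out.println(subtotal.plus(tax)); // USD 21.64
    }
}
```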
You should absolutely not use floating-point numbers for fixed-point amounts, such as currency.
In the past I've used a custom Money class that merely wraps a BigDecimal instance; it has worked well and caused no issues.
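The answer doesn't show that class, but a minimal sketch of such a wrapper might look like the following, with the fixed two-decimal scale and the method names being my own choices:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Illustrative immutable wrapper around BigDecimal for currency amounts.
public final class Money {
    private final BigDecimal amount;

    private Money(BigDecimal amount) {
        // Normalize to two decimal places so comparisons behave predictably.
        this.amount = amount.setScale(2, RoundingMode.HALF_EVEN);
    }

    public static Money of(String value) {
        // Build from a String, never from a double, to avoid 0.1-style representation error.
        return new Money(new BigDecimal(value));
    }

    public Money plus(Money other) {
        return new Money(amount.add(other.amount));
    }

    public Money times(int quantity) {
        return new Money(amount.multiply(BigDecimal.valueOf(quantity)));
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof Money m && amount.compareTo(m.amount) == 0;
    }

    @Override
    public int hashCode() {
        // Safe because the scale is normalized in the constructor.
        return amount.hashCode();
    }

    @Override
    public String toString() {
        return amount.toPlainString();
    }

    public static void main(String[] args) {
        System.out.println(Money.of("19.99").plus(Money.of("1.65"))); // 21.64
    }
}
```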
People don't just use BigDecimal/BigInteger for money. Rather, they use them in applications that need more precision than is available using double or long.
Of course, using BigDecimal and BigInteger comes at the cost of much slower arithmetical operations. For example, big number addition is O(N), where N is the number of significant digits in the number, and multiplication is O(N**2).
So the way to decide whether to use long/double or their "big" analogs is to look at how much precision your application really needs. Money applications really do need to be able to represent values without losing a single cent. Other applications are equally sensitive to precision.
But frankly, I don't think that a neural network application needs 13 decimal digits of precision. The reason your network is not behaving as it should probably has nothing to do with precision. IMO, it is more likely related to the fact that "real" neural networks don't always behave the way that they should.
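A small illustration of the precision trade-off this answer is weighing: double carries roughly 15-16 significant decimal digits and accumulates representation error, while BigDecimal tracks decimal digits exactly. This is a sketch of the difference, not a benchmark:

```java
import java.math.BigDecimal;

public class PrecisionDemo {
    public static void main(String[] args) {
        // double cannot represent 0.1 exactly, so error accumulates over many additions.
        double d = 0.0;
        for (int i = 0; i < 1_000; i++) {
            d += 0.1;
        }
        System.out.println("double sum:     " + d); // prints something like 99.9999999999986

        // BigDecimal performs exact decimal arithmetic, at a cost in speed.
        BigDecimal step = new BigDecimal("0.1");
        BigDecimal bd = BigDecimal.ZERO;
        for (int i = 0; i < 1_000; i++) {
            bd = bd.add(step);
        }
        System.out.println("BigDecimal sum: " + bd); // exactly 100.0
    }
}
```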
Do your own benchmarks and decide based on that; "what people say" means nothing.
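In that spirit, a rough micro-benchmark sketch using System.nanoTime. The loop count is arbitrary, and a proper harness such as JMH would give far more trustworthy numbers than this:

```java
import java.math.BigDecimal;

public class AdditionBenchmark {
    public static void main(String[] args) {
        final int n = 1_000_000;

        // Time n double additions.
        long t0 = System.nanoTime();
        double dSum = 0.0;
        for (int i = 0; i < n; i++) {
            dSum += 0.1;
        }
        long doubleNanos = System.nanoTime() - t0;

        // Time the equivalent BigDecimal additions.
        BigDecimal step = new BigDecimal("0.1");
        BigDecimal bdSum = BigDecimal.ZERO;
        t0 = System.nanoTime();
        for (int i = 0; i < n; i++) {
            bdSum = bdSum.add(step);
        }
        long bigNanos = System.nanoTime() - t0;

        // Print the sums so the JIT cannot dead-code-eliminate the loops.
        System.out.println("double:     " + dSum + " (" + doubleNanos / 1_000_000 + " ms)");
        System.out.println("BigDecimal: " + bdSum + " (" + bigNanos / 1_000_000 + " ms)");
    }
}
```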