lnsrch problem with optim - R, BFGS
I'm trying to fit a nonlinear least squares problem with BFGS (and L-BFGS-B) using optim. When I supply the analytical gradients, the line search terminates abnormally and the final solution is always very close to the starting point. However, when I don't supply the gradients, it seems to converge fine. Does this suggest a numerical problem to anyone? I'm pretty sure the gradients are correct. Could it be a scaling issue?
Thanks for any help.
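For concreteness, here is a toy example of the kind of setup described (not the actual model, which isn't shown): fitting y = a * exp(b * x) by least squares, supplying an analytic gradient to optim() and rescaling the parameters via control$parscale, which is the usual remedy when the parameters differ in magnitude.

# Toy nonlinear least-squares problem, for illustration only
set.seed(1)
x <- seq(0, 5, length.out = 50)
y <- 3 * exp(0.4 * x) + rnorm(length(x), sd = 0.2)

fn <- function(p) sum((y - p[1] * exp(p[2] * x))^2)     # residual sum of squares
gr <- function(p) {                                      # analytic gradient of fn
    r <- y - p[1] * exp(p[2] * x)
    c(-2 * sum(r * exp(p[2] * x)),
      -2 * sum(r * p[1] * x * exp(p[2] * x)))
}

fit <- optim(par = c(1, 0.1), fn = fn, gr = gr, method = "BFGS",
             control = list(parscale = c(3, 0.5)))       # rough magnitudes of a and b
fit$convergence                                          # 0 means successful convergence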
Comments (1)
You're pretty sure the gradients are right. Have you proved it? Have you calculated the gradients by finite difference and seen if they are about the same as the analytic gradients? That's the first place to look, I would think. I've had to do the same thing.
P.S. Have you considered Metropolis-Hastings? It's slow but robust, and no need for gradients or the Hessian.
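A small check along the lines suggested above, reusing the toy fn, gr, x, and y from the sketch in the question (substitute your own objective and gradient): compute a numerical gradient with numDeriv::grad() and compare it with the analytic one.

library(numDeriv)                        # install.packages("numDeriv") if needed
p0 <- c(1, 0.1)                          # any representative parameter vector
analytic <- gr(p0)
numeric  <- grad(fn, p0)                 # central-difference approximation
max(abs(analytic - numeric) / pmax(abs(numeric), 1e-8))  # should be roughly 1e-6 or smaller

If the relative discrepancy is large at typical parameter values, the analytic gradient (or a sign/scaling factor in it) is the likely culprit; if it is tiny everywhere yet the line search still fails, scaling or ill-conditioning becomes the more plausible explanation.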