How many function evaluations are executed between two iterations of scipy minimize?
I am using the scipy minimize function, mostly with the BFGS method. I need to find out how many function evaluations were executed between two consecutive iterations. These function evaluations are typically made to compute numerical derivatives.
If it were also possible to find out how many gradient evaluations were performed between iterations, that would be even better.
Example code:
import numpy as np
from scipy.optimize import minimize

def callback_params(theta):
    # Called by minimize after each iteration; records the current parameters.
    global params
    params = np.vstack((params, theta))

def rosen(X):
    # A 3-variable Rosenbrock-style objective.
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

init = np.random.rand(3)
params = np.empty([0, 3])
res = minimize(rosen, init, method='BFGS', options={'disp': True}, callback=callback_params)
How can I know the number of function evaluations between two consecutive rows in params?
2 Answers
The function scipy.optimize.minimize returns an OptimizeResult, one of whose members is nfev, the total number of objective function evaluations (njev similarly reports the total number of gradient evaluations, and nit the number of iterations). Note that these are totals for the whole run, not per-iteration counts.
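For reference, a minimal sketch of reading those totals from the question's example result (res is the variable from the question; nfev, njev and nit are documented OptimizeResult attributes):

# Totals over the whole optimization run, not per iteration:
print(res.nit)   # number of iterations
print(res.nfev)  # total objective function evaluations
print(res.njev)  # total gradient (Jacobian) evaluations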
You can exploit the fact that scipy.optimize.minimize calls the passed callback after each iteration.
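A minimal sketch of that idea, assuming the objective is wrapped so that it counts its own calls (the dict d and its 'total' key are illustrative names; only d['obj_evals'] appears in the answer):

import numpy as np
from scipy.optimize import minimize

d = {'obj_evals': [], 'total': 0}

def rosen(X):
    # Count every call to the objective, including the extra calls
    # BFGS makes to estimate the gradient numerically.
    d['total'] += 1
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

def callback_params(theta):
    # Runs once per iteration: store how many objective evaluations
    # happened since the previous iteration.
    d['obj_evals'].append(d['total'] - sum(d['obj_evals']))

init = np.random.rand(3)
res = minimize(rosen, init, method='BFGS', callback=callback_params)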
Then, d['obj_evals'] contains the number of objective function evaluations for each iteration. This idea can easily be extended to count the number of gradient evaluations as well, provided you pass the gradient to minimize.
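A sketch of that extension, building on the counters and rosen/init from the block above (rosen_grad and the 'grad_evals'/'grad_total' keys are illustrative names, not part of the original answer): reset the counters, supply an analytic gradient via jac=, and record its calls the same way.

# Reset counters from the previous run and add gradient counters.
d = {'obj_evals': [], 'total': 0, 'grad_evals': [], 'grad_total': 0}

def rosen_grad(X):
    # Analytic gradient of the objective above; count each call.
    d['grad_total'] += 1
    return np.array([
        -2.0 * (1.0 - X[0]) - 400.0 * X[0] * (X[1] - X[0]**2),
        200.0 * (X[1] - X[0]**2) - 2.0 * (1.0 - X[1]) - 400.0 * X[1] * (X[2] - X[1]**2),
        200.0 * (X[2] - X[1]**2),
    ])

def callback_counts(theta):
    # Per-iteration objective and gradient evaluation counts.
    d['obj_evals'].append(d['total'] - sum(d['obj_evals']))
    d['grad_evals'].append(d['grad_total'] - sum(d['grad_evals']))

res = minimize(rosen, init, method='BFGS', jac=rosen_grad, callback=callback_counts)
print(d['obj_evals'])
print(d['grad_evals'])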