How many function evaluations does scipy minimize perform between two iterations?

Posted 2025-02-05 08:06:36

I am using the scipy minimize function, mostly with the BFGS method. I need to find out how many function evaluations were executed between two consecutive iterations. These function evaluations are usually used to compute numerical derivatives.

If it is also possible to find out how many gradient evaluations were performed between iterations, that would be even better.

Example code:

import numpy as np
from scipy.optimize import minimize

def callback_params(theta):
    # Called by minimize after each iteration; record the current iterate.
    global params
    params = np.vstack((params, theta))

def rosen(X):
    # Three-dimensional Rosenbrock-style objective.
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

init = np.random.rand(3)
params = np.empty([0, 3])

res = minimize(rosen, init, method='BFGS', options={'disp': True}, callback=callback_params)

How can I know the number of function evaluations between two consecutive rows of params?

2 Answers

又爬满兰若 2025-02-12 08:06:37

The function scipy.optimize.minimize returns an OptimizeResult, whose members include

nfev, njev, nhev : int
Number of evaluations of the objective function and of its Jacobian and Hessian, respectively.
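
For example, these totals can be read directly off the result object for the run in the question (a minimal sketch; it only gives counts for the whole run, not per iteration, and for BFGS only nfev and njev are populated since no Hessian is evaluated):

import numpy as np
from scipy.optimize import minimize

def rosen(X):
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

res = minimize(rosen, np.random.rand(3), method='BFGS')

# Total evaluation counts over the whole optimization run.
print(res.nfev, res.njev)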

遮云壑 2025-02-12 08:06:36

You can exploit the fact that scipy.optimize.minimize calls the passed callback after each iteration:

import numpy as np
from scipy.optimize import minimize

def callback(xk, d):
    # minimize calls this after every iteration: start a fresh counter
    # for the next iteration's objective evaluations.
    d['obj_evals'].append(0)
    d['iters'] += 1

def rosen(X, d):
    # Count this evaluation against the current iteration.
    d['obj_evals'][d['iters']] += 1
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

init = np.random.rand(3)

d = {'obj_evals': [0], 'iters': 0}
obj_fun = lambda x: rosen(x, d)
cb = lambda xk: callback(xk, d)

res = minimize(obj_fun, init, method='BFGS', options={'disp': True}, callback=cb)

Then, d['obj_evals'] contains the number of objective function evaluations for each iteration. The same idea can easily be extended to count gradient evaluations, provided you also pass the gradient to minimize.
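
A minimal sketch of that extension, counting gradient calls per iteration as well. The analytic gradient rosen_grad below is an assumption added for illustration and is not part of the original answer:

import numpy as np
from scipy.optimize import minimize

def rosen(X, d):
    d['obj_evals'][d['iters']] += 1
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

def rosen_grad(X, d):
    # Hypothetical analytic gradient of the objective above,
    # counted per iteration just like the function values.
    d['grad_evals'][d['iters']] += 1
    return np.array([
        -2.0 * (1.0 - X[0]) - 400.0 * X[0] * (X[1] - X[0]**2),
        200.0 * (X[1] - X[0]**2) - 2.0 * (1.0 - X[1]) - 400.0 * X[1] * (X[2] - X[1]**2),
        200.0 * (X[2] - X[1]**2),
    ])

def callback(xk, d):
    # Open a new counter slot for the next iteration.
    d['obj_evals'].append(0)
    d['grad_evals'].append(0)
    d['iters'] += 1

d = {'obj_evals': [0], 'grad_evals': [0], 'iters': 0}
res = minimize(lambda x: rosen(x, d), np.random.rand(3), method='BFGS',
               jac=lambda x: rosen_grad(x, d), callback=lambda xk: callback(xk, d))

print(d['obj_evals'])   # objective evaluations per iteration
print(d['grad_evals'])  # gradient evaluations per iteration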
