Callback for the gradient norm from scipy.minimize (Python)
I am using scipy.minimize with the 'CG' method, and I want to record the gradient norm at each iteration via the callback. So far, I have been able to record the function value at each iteration using this:
from scipy.optimize import minimize

def min_method(fn, grad, x0):
    all_fn = [fn(x0).item()]  # function value at the starting point

    def store(x):  # callback, invoked once per iteration with the current iterate
        all_fn.append(fn(x).item())

    ans = minimize(fn, x0, method='CG', jac=grad, callback=store,
                   options={'disp': True, 'gtol': 1e-06})
    return ans, all_fn
How can I add a line to the store() function in order to get the gradient norm at each iteration?
1 Answer
You could use the approx_fprime function. It could be something like the sketch below.
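The answer's code block did not survive extraction; a minimal sketch of the idea, assuming the callback still closes over fn from the question and appends norms to a second list all_gnorm (a hypothetical name), could be:

    import numpy as np
    from scipy.optimize import approx_fprime

    eps = np.sqrt(np.finfo(float).eps)  # conventional finite-difference step

    def store(x):  # callback function
        all_fn.append(fn(x).item())
        # finite-difference estimate of the gradient at the current iterate
        g = approx_fprime(x, lambda z: float(fn(z)), eps)
        all_gnorm.append(np.linalg.norm(g))

Note that the question already supplies an analytic gradient as jac=grad, so calling np.linalg.norm(grad(x)) inside the callback is simpler and exact; approx_fprime is mainly useful when no analytic gradient is available.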
Update

If you want the jacobian computed by 'CG' itself, I think you need to modify the function that is called through scipy.optimize.minimize.
In _minimize.py, go to the function minimize and find the call for the 'CG' method (line 673 in the version the answer refers to), which dispatches to _minimize_cg.
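For orientation, the dispatch inside minimize looks roughly like this (line numbers shift between scipy versions):

    # scipy/optimize/_minimize.py, inside minimize() -- abridged
    elif meth == 'cg':
        return _minimize_cg(fun, x0, args, jac, callback, **options)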
In that function, in optimize.py around line 1681, replace the line that invokes the callback so that the current gradient is passed along as well; then you can have access to the jacobian in the callback function.
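The before/after snippets of the original answer are missing; assuming a scipy version in which _minimize_cg invokes the callback as callback(xk) and keeps the current gradient in a variable named gfk, the modification was presumably along these lines:

    # scipy/optimize/optimize.py, inside _minimize_cg (around line 1681)
    # before (stock scipy):
    if callback is not None:
        callback(xk)

    # after (local modification):
    if callback is not None:
        callback(xk, gfk)  # also forward the gradient at the current iterate

The callback then needs a matching two-argument signature, for example:

    def store(x, g):  # g is the gradient forwarded by the patched _minimize_cg
        all_fn.append(fn(x).item())
        all_gnorm.append(np.linalg.norm(g))

Bear in mind that editing an installed copy of scipy only affects that environment, so this is more of a quick experiment than a durable fix.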