Using Martin Eastwood's formula with scipy.optimize

Posted on 2025-01-29 12:59:00


I have difficulty fitting Martin Eastwood's interpolation formula with scipy.optimize.minimize:

z = (x^w1 / (x^w2 + y^w3)) * w4 * 17
(we get 16 instead of 17 when x[3], x[4], x[16], x[18] are plugged into the formula)

My data set (17/12/12 preml.ge)

x=np.array([33,43,28,26,28,30,26,24,15,21,23,28,19,18,19,22,15,19,18,15])
y=np.array([15,24,17,16,21,25,22,21,13,20,23,29,25,24,26,32,24,31,32,30])
z=np.array([36,42,29,24,27,29,23,27,24,23,22,20,25,16,17,15,18, 9,15,10])
data=np.array([x, y, z])

Ten years ago, Martin Eastwood (an enthusiast blogger) found:

w1=1.122777, w2=1.072388, w3=1.127248, w4=2.499973
where RMSE=3.657522858 for my problem.

What I want to know is which approach I could use to obtain these w parameters, i.e. estimates like those above.
I have read the existing answers, but the method is not easy for me to follow. I need your help.

Added: a further question, how can we estimate the w parameters for each individual set {x_i, y_i, z_i} instead of the whole {x, y, z} as above?
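For the first question, one option (a sketch of my own, not from the original thread) is to minimize the RMSE directly with scipy.optimize.minimize; the Nelder-Mead method and the starting point (1, 1, 1, 1) below are my assumptions, not something the post prescribes:

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([33,43,28,26,28,30,26,24,15,21,23,28,19,18,19,22,15,19,18,15], dtype=float)
y = np.array([15,24,17,16,21,25,22,21,13,20,23,29,25,24,26,32,24,31,32,30], dtype=float)
z = np.array([36,42,29,24,27,29,23,27,24,23,22,20,25,16,17,15,18, 9,15,10], dtype=float)

def pred(w):
    # model: z ~ (x^w1 / (x^w2 + y^w3)) * w4
    w1, w2, w3, w4 = w
    return (x**w1 / (x**w2 + y**w3)) * w4

def rmse(w):
    return np.sqrt(np.mean((z - pred(w))**2))

# derivative-free simplex search; adequate for a 4-parameter problem like this
res = minimize(rmse, x0=(1.0, 1.0, 1.0, 1.0), method="Nelder-Mead")
print(res.x, rmse(res.x))
```

On the added question: a single triple {x_i, y_i, z_i} gives one equation with four unknowns, so estimating all four w per sample is underdetermined; a per-sample fit only makes sense over subsets of the data or with some of the weights held fixed.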


Answer by 心头的小情儿, posted 2025-02-05 12:59:00


Using least-squares is better, because the method sees the variation of each sample individually instead of only the final sum.

import scipy.optimize
import numpy as np
import matplotlib.pyplot as plt

x = np.array([33,43,28,26,28,30,26,24,15,21,23,28,19,18,19,22,15,19,18,15])
y = np.array([15,24,17,16,21,25,22,21,13,20,23,29,25,24,26,32,24,31,32,30])
z = np.array([36,42,29,24,27,29,23,27,24,23,22,20,25,16,17,15,18, 9,15,10])

# model: z ~ (x^w1 / (x^w2 + y^w3)) * w4
pred = lambda w: (x**w[0] / (x**w[1] + y**w[2])) * w[3]
w_given = (1.122777, 1.072388, 1.127248, 2.499973)

# fit once from a naive guess and once starting from the published weights
w, _ = scipy.optimize.leastsq(lambda w: z - pred(w), (1, 1, 1, 1))
w_guided, _ = scipy.optimize.leastsq(lambda w: z - pred(w), w_given)

Let's visualize:

plt.plot(z, pred(w), '.')
# 17 introduced here arbitrarily
plt.plot(z, pred(w_given)*17, '+')
plt.plot(z, pred(w_guided), '+')
plt.plot(np.sort(z), np.sort(z), '--')
plt.legend(['dumb guess', 'given w (scaled)', 'init with given w', 'target'])

Check whether the fit gives better results than the initial guess (sanity check):

(np.mean((z - pred(w))**2), 
 np.mean((z - pred(w_guided))**2), 
 np.mean((z - pred(w_given)*17)**2), 
 np.mean((z - pred(w_given)*16)**2))
(10.987132120174204,
 10.987132121290418,
 15.064715846376691,
 17.341093598858798)
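An alternative worth knowing about, if you also want uncertainty estimates for the fitted weights: scipy.optimize.curve_fit solves the same least-squares problem and additionally returns the parameter covariance matrix. The p0 starting point here is my assumption:

```python
import numpy as np
from scipy.optimize import curve_fit

x = np.array([33,43,28,26,28,30,26,24,15,21,23,28,19,18,19,22,15,19,18,15], dtype=float)
y = np.array([15,24,17,16,21,25,22,21,13,20,23,29,25,24,26,32,24,31,32,30], dtype=float)
z = np.array([36,42,29,24,27,29,23,27,24,23,22,20,25,16,17,15,18, 9,15,10], dtype=float)

def model(xy, w1, w2, w3, w4):
    # same formula as above, in the (independent vars, *params) shape curve_fit expects
    x, y = xy
    return (x**w1 / (x**w2 + y**w3)) * w4

popt, pcov = curve_fit(model, (x, y), z, p0=(1, 1, 1, 1))
perr = np.sqrt(np.diag(pcov))  # one-sigma uncertainty per weight
print(popt, perr)
```

The large per-parameter uncertainties you can expect on only 20 samples are themselves informative: they indicate how strongly the four weights trade off against each other in this model.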