Minimizing an objective function by changing a variable - in Matlab?

Posted 2024-11-03 02:58:54

I have a 101x82 size matrix called A. Using this variable matrix, I compute two other variables called:

1) B, a 1x1 scalar, and

2) C, a 50x6 matrix.

I compare 1) and 2) with their analogous variables 3) and 4), whose values are fixed:

3) D, a 1x1 scalar, and

4) E, a 50x6 matrix.

Now, I want to perturb/change the values of A matrix, such that:

1) ~ 3), i.e. B is nearly equal to D, and

2) ~ 4), i.e. C is nearly equal to E.

Note that on perturbing A, B and C will change, but not D and E.

Any ideas how to do this? Thanks!

Comments (2)

遮云壑 2024-11-10 02:58:54

I can't run your code, since it needs data to be loaded (which I don't have), and it's not immediately obvious how B or C is calculated.

Fortunately, I may still be able to answer your question. You are describing an optimization problem, and the solution would be to use fminsearch (or something of that variety).

What you do is define a cost function built from two error terms:

y1 = (B - D)^weight1;
y2 = norm(C - E, weight2);

with the weights controlling how strongly you penalize deviations (a weight of 2 is usually sufficient). Since fminsearch minimizes a single scalar, have the function return something like y1 + y2. Your function variable would be A.
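
To make this concrete, here is a minimal sketch of the fminsearch approach. The functions computeB and computeC, and the values chosen for D and E, are hypothetical stand-ins, since the question does not say how B and C are actually computed from A:

% Minimal sketch, not the poster's actual code: computeB and computeC are
% hypothetical placeholders for the real mappings from A to B and C, and the
% values of D and E are made up for illustration.
computeB = @(A) mean(A(:));                % placeholder: B computed from A
computeC = @(A) A(1:50, 1:6);              % placeholder: C computed from A

A0 = rand(101, 82);                        % starting guess for A
D  = 0.75;                                 % fixed 1x1 target (example value)
E  = 0.5 * ones(50, 6);                    % fixed 50x6 target (example value)

% fminsearch minimizes a scalar, so both mismatch terms are combined into one cost.
cost = @(a) (computeB(reshape(a, 101, 82)) - D)^2 + ...
            norm(computeC(reshape(a, 101, 82)) - E, 'fro')^2;

opts = optimset('MaxFunEvals', 1e5, 'MaxIter', 1e5);
[aBest, bestCost] = fminsearch(cost, A0(:), opts);   % search over the elements of A
Abest = reshape(aBest, 101, 82);

One caveat: A has 101*82 = 8282 free elements, and the derivative-free Nelder-Mead search behind fminsearch tends to scale poorly to that many variables, so a gradient-based routine (see the other answer, or fminunc if gradients are available) may be a better fit.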

笑咖 2024-11-10 02:58:54

From my understanding you have a few functions.

fb(A) = B

fc(A) = C

Do you know the functions listed above? That is, do you know the mappings from A to each of these?
If you want to optimize so that B is close to D, you need to pick:

  1. What "close" means. You can look at some vector norm for the B and D case, like minimizing ||B - D||^2. The standard sum of the squares of the elements of this difference will probably do the trick and is computationally nice (see the snippet after this list).
  2. How to optimize. This depends a lot on your functions, whether you want local or global minima, etc.
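
For point 1, the sum-of-squares measure is a one-liner in MATLAB. The values below are made-up stand-ins for B and C (which would be computed from A) and the fixed targets D and E:

% Example stand-in values for B, C (computed from A) and the fixed targets D, E.
B = 1.2;          D = 1.0;
C = rand(50, 6);  E = zeros(50, 6);

% Sum-of-squares "closeness": squared scalar error plus squared Frobenius norm.
cost = (B - D)^2 + norm(C - E, 'fro')^2;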

So basically, for the B and D case (the C and E case is analogous), we've now boiled the problem down to minimizing:

Cost = ||fb(A) - D||^2

One thing you can certainly do is to compute the gradient of this cost function with respect to the individual elements of A, and then perform minimization steps with the forward Euler method and a suitable "time step". This might not be fast, but with a small enough time step and well-behaved enough functions it will at least get you to a local minimum.

Computing the gradient of this cost (note that D is fixed, so its gradient with respect to A is zero):

grad_A(Cost) = 2*(fb(A) - D)*grad_A(fb)(A)

where grad_A means the gradient with respect to A, and grad_A(fb)(A) means the gradient of the function fb with respect to A, evaluated at A, etc.

Computing grad_A(fb)(A) depends on the form of fb, but here are some pages that have "Matrix calculus" identities and explanations.

Matrix calculus identities
Matrix calculus explanation

Then you simply perform gradient descent on A by doing forward Euler updates:

A_next = A_prev - timestep * grad_A(Cost)
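
As a concrete illustration of these updates, here is a minimal MATLAB sketch. It uses a toy stand-in fb(A) = sum(A(:)), whose gradient with respect to A is simply ones(size(A)); for the real fb (and an analogous fc/E term) you would substitute the gradients derived from the matrix calculus identities above:

% Toy example (assumption): fb(A) = sum(A(:)) stands in for the real mapping.
fb     = @(A) sum(A(:));               % stand-in for the real fb
gradFb = @(A) ones(size(A));           % its gradient with respect to A
D      = 4000;                         % fixed scalar target (example value)

A        = rand(101, 82);              % initial A
timestep = 5e-5;                       % forward Euler "time step"
for k = 1:1000
    g = 2 * (fb(A) - D) * gradFb(A);   % grad_A(Cost) for Cost = (fb(A) - D)^2
    A = A - timestep * g;              % A_next = A_prev - timestep * grad_A(Cost)
    if abs(fb(A) - D) < 1e-9           % stop once B is nearly equal to D
        break
    end
end
fprintf('|B - D| = %.2e after %d iterations\n', abs(fb(A) - D), k);

The time step has to be small enough for the updates to stay stable (roughly below 1/numel(A) for this particular toy fb), which is the "well-behaved enough" caveat above.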
