Low Rank Matrix Computation
Suppose I have a 3x3 matrix A of rank 3.
I want to create a matrix of Rank 2 which is closest to A in the $ {l}_{2} $ / Frobenius norm.
Let's call this matrix F.
It is easy to achieve by the SVD: namely, if $ A = U S {V}^{H} $ is the SVD decomposition, then $ F = U \hat{S} {V}^{H} $,
where $ \hat{S} $ is the same as $ S $ with the last singular value zeroed.
The question is: is there a less computationally intensive method to create F than using the SVD decomposition?
Thanks.
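For reference, the truncated-SVD construction described above can be sketched in NumPy (the matrix A here is just a made-up example):

```python
import numpy as np

# A hypothetical full-rank 3x3 matrix (example values only)
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])

# SVD: A = U @ diag(s) @ Vh, singular values in descending order
U, s, Vh = np.linalg.svd(A)

# Zero the smallest singular value; by the Eckart-Young theorem this
# gives the closest rank-2 matrix in both spectral and Frobenius norms.
s_hat = s.copy()
s_hat[-1] = 0.0
F = U @ np.diag(s_hat) @ Vh

print(np.linalg.matrix_rank(F))   # 2
```

Note that F differs from A exactly by the rank-one term for the smallest singular value.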
If you KNOW the matrix to be rank 3, then exactly 3 Householder transformations will be sufficient to reduce an nxm matrix to a 3xm matrix. One can now easily convert this into an eigenvalue problem. Compute the characteristic polynomial. For example, consider this matrix (I'll use MATLAB to do the work):
Clearly, B will have rank 3, but if you don't trust me, rank confirms that claim.
So B is a 10x6 matrix, with rank 3. The SVD confirms that fact also.
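The original MATLAB session isn't reproduced in this page. As a stand-in, a 10x6 rank-3 matrix with the same properties can be built in NumPy (the entries below are my own example, not the answer's B):

```python
import numpy as np

rng = np.random.default_rng(0)

# Any 10x6 rank-3 matrix will do: the product of a random 10x3
# factor and a random 3x6 factor has rank 3 with probability one.
B = rng.standard_normal((10, 3)) @ rng.standard_normal((3, 6))

print(np.linalg.matrix_rank(B))   # 3
# The SVD agrees: only 3 singular values are numerically nonzero
print(np.sum(np.linalg.svd(B, compute_uv=False) > 1e-10))   # 3
```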
I'm feeling too lazy to write the Householder transformations. (I've got some code for that laying around, but...) So I'll use QR to help me out.
The multiply here shows what we would have seen after three Householder transformations. See that it is upper triangular, as we would expect. Next, compute the characteristic polynomial. I'll do it symbolically here, using my own tools, but the computation is just a bit of algebra.
What are the roots of P, i.e., the eigenvalues of C*C'?
We know that the eigenvalues must be the squares of the singular values here, so a good test is to see if we recovered the singular values that svd found. So expanding the display format again, we see it did so nicely.
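A hedged NumPy sketch of these steps, using `numpy.linalg.qr` in place of explicit Householder reflections and a made-up rank-3 B (not the answer's matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((10, 3)) @ rng.standard_normal((3, 6))  # rank-3 stand-in

# QR stands in for the three Householder reflections: since rank(B) = 3,
# only the first 3 rows of R are numerically nonzero.
Q, R = np.linalg.qr(B)
C = R[:3, :]                              # 3x6, same singular values as B

# Characteristic polynomial of the 3x3 matrix C @ C.T, then its roots
p = np.poly(C @ C.T)                      # coefficients, highest degree first
roots = np.sort(np.roots(p).real)[::-1]

# The roots should be the squares of B's (nonzero) singular values
s = np.linalg.svd(B, compute_uv=False)[:3]
print(np.allclose(roots, s**2))           # True
```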
Mathematics can be fun when it works. What can we do with this information? If I know a specific eigenvalue, I can compute the corresponding eigenvector. Essentially, we need to solve a linear 3x3 homogeneous system of equations.
Again, I'll be lazy here and find the solution without actually writing any code. Gaussian elimination would do it though.
So we have V, an eigenvector of C*C'. We can convince ourselves that it is one by the following call to svd.
So by subtracting off that component of C in the direction of V, we get a rank 2 matrix.
Similarly, we can transform V into the original problem space, and use that to transform the matrix B, our original matrix, by a rank one update of B.
What is the rank of D?
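Putting the eigenvector and rank-one-update steps together in NumPy (again with a made-up rank-3 B, and using `eigh` where the answer finds the eigenvector by Gaussian elimination):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((10, 3)) @ rng.standard_normal((3, 6))  # rank-3 stand-in

Q, R = np.linalg.qr(B)
C = R[:3, :]                    # 3x6 reduced matrix, same singular values as B

# Eigenvector of C @ C.T for its SMALLEST eigenvalue, i.e. the left
# singular vector of C for the smallest singular value (eigh sorts ascending)
w, W = np.linalg.eigh(C @ C.T)
V = W[:, 0]

# Subtract the component of C in the direction of V: the rank drops to 2
C2 = C - np.outer(V, V @ C)
print(np.linalg.matrix_rank(C2))          # 2

# Equivalently, map V back to the original 10-dimensional space and apply
# the same rank-one update to B itself; note that u.T @ B equals V @ C here.
u = Q[:, :3] @ V
D = B - np.outer(u, V @ C)
print(np.linalg.matrix_rank(D))           # 2
```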
As you can see, even though I did a lot of mathematics, calling svd, QR, rank, etc., all multiple times, in the end, the actual computation was relatively trivial. I needed only...
All of these computational steps would be fast and efficient for ANY size matrix, as long as we know the actual rank is 3. Not even worth a paper on the subject.
EDIT:
Since the question has been revised such that the matrix itself is only of size 3x3, the computation is even more trivial. I'll leave the first part of the post as is however, since it describes a fully valid solution for a general matrix of any size that is of rank 3.
If your goal is to kill off the last singular value of a 3x3 matrix, the svd on a 3x3 seems like it would be pretty efficient to do. There will also be some loss of precision in generating the last singular value by indirect means. For example, compare the smallest singular value computed by svd with the one obtained from the eigenvalues. You may see some small error there, since forming A'*A causes some loss of precision. The extent of this loss will probably depend on the condition number of A.
However, if you really want to do the work yourself, you could do it.
Is this any simpler or less computationally expensive than a simple call to svd, followed by a rank one update? I'm not at all sure it is worth the effort on a 3x3. A 3x3 svd really is extremely fast to compute.
You probably already considered this, but if A is normal, the SVD can be computed by eigenvalue decomposition. This amounts to solving the characteristic polynomial, and, since it is a degree-3 polynomial for a rank-3 matrix, the roots can be found by the well-known cubic formula.
It also looks like the SVD in general must be reducible to solving a cubic for a matrix of rank 3, but I cannot remember reading anything on that. A quick google-around turned up this piece of code, which claims to solve the rank-3 SVD in this manner. There is no accompanying paper, unfortunately. If using this code, testing it with a non-positive-definite matrix would be a good test case.
Edit: On a second reading, it seems that the author there is using eigendecomposition as well. Likely not good on non-PD matrices, but I'd be happy to be proven wrong here.
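For the normal (here, real symmetric and nonsingular) case, a small NumPy sketch of the eigendecomposition route; the matrix is my own example:

```python
import numpy as np

# A real symmetric (hence normal) example matrix
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Eigendecomposition A = Q @ diag(lam) @ Q.T; the singular values are
# |lam|, with any negative sign absorbed into the left singular vectors.
lam, Qe = np.linalg.eigh(A)
s = np.abs(lam)
U = Qe * np.sign(lam)           # flip columns belonging to negative eigenvalues
Vh = Qe.T

# Reorder descending, the convention svd uses
order = np.argsort(s)[::-1]
s, U, Vh = s[order], U[:, order], Vh[order, :]

print(np.allclose(U @ np.diag(s) @ Vh, A))                  # True
print(np.allclose(s, np.linalg.svd(A, compute_uv=False)))   # True
```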
You can use power iteration to find the singular vectors corresponding to the largest singular value. Once you have that, you can use a modified version of power iteration to find the vectors for the second largest singular value; after each iteration you'll need to subtract off the projection of the partial solution vector onto the vectors for the largest singular value.
This is a poor approach for finding all the singular vectors, but should work just fine for the first two.
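A minimal NumPy sketch of this idea, assuming a made-up 3x3 A and using explicit deflation (subtracting the converged rank-one term) in place of the per-iteration reprojection described above:

```python
import numpy as np

def top_singular_triplet(A, iters=500):
    """Power iteration on A.T @ A: converges to the dominant right
    singular vector (for a generic start and a singular-value gap)."""
    rng = np.random.default_rng(1)
    v = rng.standard_normal(A.shape[1])
    for _ in range(iters):
        v = A.T @ (A @ v)
        v /= np.linalg.norm(v)
    sigma = np.linalg.norm(A @ v)
    return (A @ v) / sigma, sigma, v

# Made-up 3x3 example matrix
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])

u1, s1, v1 = top_singular_triplet(A)
# Deflate the converged rank-one term, then iterate again for sigma_2
u2, s2, v2 = top_singular_triplet(A - s1 * np.outer(u1, v1))

s = np.linalg.svd(A, compute_uv=False)
print(np.allclose([s1, s2], s[:2]))   # True
```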