How to calculate the SVD (singular value decomposition) of an upper triangular matrix
Do you know an algorithm that calculates the SVD using BLAS or LAPACK?
Let's say I have a symmetric matrix A:
1 22 13 14
22 1 45 24
13 45 1 34
14 24 34 1
After I get the upper triangular matrix G from A:
1 22 13 14
0 1 45 24
0 0 1 34
0 0 0 1
- How do I calculate the SVD of A, but with the values of G?
- Do I have to pass all of matrix A, or is it sufficient to pass G (the intermediate matrix)?
In fact, after processing I only get the G matrix, but since A is symmetric, how do I compute the SVD of the symmetric A having only G (in other words, having only A's upper triangle)?
You cannot compute the SVD of a matrix without having access to all of the values in the matrix (i.e. you can't do it based solely on the upper triangle).
To see this, compare the SVDs of the matrices (the original answer showed these as images; a minimal 2×2 example is reconstructed here):

1 1        1 1
0 1        1 1

Or, more generally, take the SVD of the matrix

1 1
x 1

for various values of x. Observe that the singular values differ even though the upper triangles agree, and conclude that you can't compute an SVD based solely on the upper triangle.
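This is easy to check numerically. A quick sketch using NumPy (rather than calling LAPACK directly), with the x = 0 and x = 1 cases of the matrix above:

```python
import numpy as np

# Two matrices with identical upper triangles but different lower triangles.
U = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # x = 0
V = np.array([[1.0, 1.0],
              [1.0, 1.0]])  # x = 1

s_U = np.linalg.svd(U, compute_uv=False)  # ~[1.618, 0.618]
s_V = np.linalg.svd(V, compute_uv=False)  # [2, 0]

# Same upper triangle, different singular values.
assert not np.allclose(s_U, s_V)
```

Since the singular values already disagree, no choice of U and V factors could make the two decompositions coincide.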
Edit:
Alberto correctly observes that the questioner may be working with symmetric (or Hermitian) matrices, for which it is absolutely possible to compute the SVD based solely on the upper triangle.
Finally had a chance to get back to this: one generally doesn't perform an SVD on symmetric matrices, because the SVD is over-general. All the eigenvalues of a symmetric matrix are real and the eigenvectors form an orthonormal basis, so the "SVD" is really just the usual eigendecomposition. The exact LAPACK routines that you want to use vary somewhat with the specifics of matrix storage. Intel maintains a good reference on the LAPACK routines; you may find their decision tree of LAPACK routines for symmetric eigenproblems useful.
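As a concrete sketch of this, NumPy's LAPACK-backed `eigh` accepts `UPLO='U'`, in which case it reads only the upper triangle of its input (like the underlying symmetric eigensolvers). So the questioner's G alone is enough, using the matrix A from the question:

```python
import numpy as np

# The symmetric matrix A from the question.
A = np.array([[ 1., 22., 13., 14.],
              [22.,  1., 45., 24.],
              [13., 45.,  1., 34.],
              [14., 24., 34.,  1.]])

# G holds only A's upper triangle; with UPLO='U', eigh never reads the
# strict lower triangle, so its contents are irrelevant.
G = np.triu(A)

# Eigendecomposition of the symmetric matrix, from the upper triangle alone.
w, Q = np.linalg.eigh(G, UPLO='U')

# For a symmetric matrix the singular values are the absolute values of the
# eigenvalues, so the SVD of A is recoverable from (w, Q).
s = np.linalg.svd(A, compute_uv=False)
assert np.allclose(np.sort(np.abs(w))[::-1], s)
```

The same pattern applies when calling LAPACK directly: the symmetric drivers take a `UPLO` argument and a single triangle of the matrix.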