GSL/BLAS: multiplying a matrix with an inverse matrix
I'm using the GNU GSL to do some matrix calculations. I'm trying to multiply a matrix B with the inverse of a matrix A.
Now I've noticed that the BLAS part of GSL has a function to do this, but only if A is triangular. Is there a specific reason for this? Also, what would be the fastest way to do this computation? Should I invert A using LU decomposition, or is there a better way?
FWIW, A has the form P'GP, where P is a normal matrix, P' is its inverse, and G is a diagonal matrix.
Thanks a bunch :)
2 Answers
In a nutshell:

The fact that only triangular matrices are supported is simply because this operation is really easy to perform for a triangular matrix (it's called back-substitution), and BLAS only provides routines for low-level functions. Any higher-level function is usually found in LAPACK, which uses BLAS internally for block-wise operations.
In the specific case you're dealing with, A := P'*G*P, if P is a normal matrix then the inverse of A can be written inv(A) := P'*inv(G)*P. You then have two options:

1. Explicitly build inv(A) and multiply it with B.
2. Multiply with the different factors of inv(A) one after the other: multiply B by P (on the left), then rescale each row of the result with the inverse of the corresponding diagonal element of G, and then multiply again with P' (on the left again).

Depending on the specific case you can pick your method. Either way, assuming double precision, you'll need dgemm (matrix-matrix multiplication, Lvl-3 BLAS) and dscal (row scaling, Lvl-1 BLAS).

Hope this helps.
A.
I believe Adrien is right about the reason for BLAS not having an inverse function for square matrices. The best way to optimize the computation of the inverse depends on the matrix you are working with.
In general you can use the LU decomposition, which is valid for any square matrix: decompose with gsl_linalg_LU_decomp() and then call gsl_linalg_LU_invert(), where A is the square matrix whose inverse you want, p is a gsl_permutation object (the object in which the permutation matrix is encoded), signum is the sign of the permutation, and invA is the inverse of A.

Since you state that
A = P'GP, with P normal and G diagonal, A is probably a normal matrix. I haven't used them in a while, but there must be a factorization theorem for them, which you can find in Chapter 14 of the GSL reference manual or, even better, in a linear algebra book.

Just as an example, if you had a symmetric positive-definite matrix and wanted to find its inverse, you could use the Cholesky factorization, which is optimized for that kind of matrix. Then you could use the
gsl_linalg_cholesky_decomp() and gsl_linalg_cholesky_invert() functions in the GSL to make it efficient.

I hope it helps!