Calculating the eigenvectors/eigenvalues of a 2x2 tensor
I'm implementing the system described in this paper, and I'm getting a little stuck. I only recently encountered tensors, eigenvalues, etc., so excuse me if this is a little simple!
Given a 2x2 tensor, how can I calculate the major and minor eigenvectors of it?
Bonus points for implementations which are easy to translate into C# ;)
(I'm using MATLAB matrix notation; semicolons mean "new row".)
[u;v] is an eigenvector of [a b; c d] with eigenvalue t if [a b; c d] [u;v] = t[u;v]. That means that au+bv=tu and cu+dv=tv; that is, (a-t)u+bv=0 and cu+(d-t)v=0. If this has any nontrivial solutions, it's because those two equations are the same apart from a constant factor; then your eigenvector is [u;v] = [b;t-a]. (Unique only up to a constant factor, of course.)
The eigenvalues are the values of t for which this is possible; that is, for which those two equations are the same apart from a constant factor. That happens iff the matrix [a-t b; c d-t] is singular, which means its determinant is zero, which means (a-t)(d-t)-bc=0, which means t^2 - (a+d)t + (ad-bc) = 0.
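That quadratic can be solved directly with the usual formula. Here is a minimal sketch (the class and method names are my own, not from the answer); I've used Java since it translates almost line-for-line into C#:

```java
// Eigenvalues of [a b; c d] from the characteristic polynomial
// t^2 - (a+d)t + (ad-bc) = 0, solved with the quadratic formula.
public class Eigen2x2 {
    // Returns {major, minor} (larger eigenvalue first), assuming both are real.
    static double[] eigenvalues(double a, double b, double c, double d) {
        double trace = a + d;
        double det = a * d - b * c;
        double disc = trace * trace - 4 * det; // negative => complex pair
        double s = Math.sqrt(disc);            // NaN if disc < 0
        return new double[] { (trace + s) / 2, (trace - s) / 2 };
    }

    public static void main(String[] args) {
        double[] t = eigenvalues(2, 0, 0, 1);  // diagonal [2 0; 0 1]
        System.out.println(t[0] + " " + t[1]); // prints "2.0 1.0"
    }
}
```

A negative discriminant means the complex-eigenvalue case discussed below; this sketch only handles real eigenvalues.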
So: Solve that equation for the eigenvalues, and then get the eigenvectors the way I described above.
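The eigenvector step above can be sketched like this (again in Java for easy translation to C#). Note that the answer's formula [b; t-a] degenerates when b = 0; the fallback to the second row, and then to the diagonal case, is my own addition, not something the answer spells out:

```java
// Eigenvector for a known real eigenvalue t of [a b; c d],
// using the formula [b; t-a] with fallbacks for b == 0.
public class Eigenvector2x2 {
    static double[] eigenvector(double a, double b, double c, double d, double t) {
        if (b != 0) return new double[] { b, t - a };  // row 1: (a-t)u + b*v = 0
        if (c != 0) return new double[] { t - d, c };  // row 2: c*u + (d-t)v = 0
        // b == c == 0: the matrix is diagonal, so the axis vectors work.
        return (t == a) ? new double[] { 1, 0 } : new double[] { 0, 1 };
    }

    public static void main(String[] args) {
        // [2 1; 1 2] has eigenvalues 3 and 1; for t = 3 the vector is [1; 1].
        double[] v = eigenvector(2, 1, 1, 2, 3);
        System.out.println(v[0] + " " + v[1]); // prints "1.0 1.0"
    }
}
```

Remember the result is only defined up to a constant factor, so normalize it yourself if you need a unit-length major/minor eigenvector.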
The eigenvalues may be complex (e.g., for a rotation). In that case you'll get complex eigenvectors too, which is correct because you can't have Av=kv with A,v real but k not real.
Two warnings.
Some matrices have two equal eigenvalues. They may or may not have two independent eigenvectors. If the matrix is a constant multiple of the identity matrix -- of the form [k 0; 0 k] -- then it does have two independent eigenvectors, and in fact any two independent vectors will do, because everything is an eigenvector. Otherwise, there is only one independent eigenvector (up to scaling), and any theorems or algorithms that rely on having two are liable to fail. This will happen, e.g., with "shear" matrices of the form [1 k; 0 1].
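To see the shear warning concretely, here is a small demo (my own example values, using the formulas from the answer) showing that [1 3; 0 1] has a repeated eigenvalue and only one eigenvector direction:

```java
// The shear [1 k; 0 1] (k != 0) has the double eigenvalue 1 but only
// one independent eigenvector direction, [1; 0].
public class ShearDemo {
    public static void main(String[] args) {
        double a = 1, b = 3, c = 0, d = 1;      // the shear [1 3; 0 1]
        double trace = a + d, det = a * d - b * c;
        double disc = trace * trace - 4 * det;
        System.out.println(disc);               // prints "0.0": repeated root
        double t = trace / 2;                   // the double eigenvalue, 1.0
        // The formula [b; t-a] gives [3; 0], proportional to [1; 0];
        // there is no second independent direction for either root.
        System.out.println(b + " " + (t - a));  // prints "3.0 0.0"
    }
}
```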
In dimensions higher than 2, you don't want to do things this way. There are much better (but more complicated) ways of computing eigenvectors and eigenvalues, and the ways in which you can fail to have the "right" number of independent eigenvectors are more extensive.