Matlab vs Mathematica: eigenvectors?
function H = calcHyperlinkMatrix(M)
[r, c] = size(M);
H = zeros(r, c);
outdeg = sum(M, 2);          % outgoing links per page (row sums of M)
for i = 1:r
    for j = 1:c
        if M(j, i) == 1
            H(i, j) = 1 / outdeg(j);   % page j splits its rank evenly
        end
    end
end
H

function V = pageRank(M)
[V, D] = eigs(M, 1);         % eigenvector for the largest-magnitude eigenvalue
V

function R = google(links)
R = pageRank(calcHyperlinkMatrix(links));
R
M = [0 1 1 0 0 0 0 0;
     0 0 0 1 0 0 0 0;
     0 1 0 0 1 0 0 0;
     0 1 0 0 1 1 0 0;
     0 0 0 0 0 1 1 1;
     0 0 0 0 0 0 0 1;
     1 0 0 0 1 0 0 1;
     0 0 0 0 0 1 1 0]
google(M)
ans =
-0.1400
-0.1576
-0.0700
-0.1576
-0.2276
-0.4727
-0.4201
-0.6886
Mathematica:
calculateHyperlinkMatrix[linkMatrix_] := Module[{r, c, H, i, j},
  {r, c} = Dimensions[linkMatrix];
  H = Table[0, {r}, {c}];
  For[i = 1, i <= r, i++,
   For[j = 1, j <= c, j++,
    If[linkMatrix[[j, i]] == 1,
     H[[i, j]] = 1/Total[linkMatrix[[j]]]]
    ]
   ];
  H
  ]
H = {{0, 0, 0, 0, 0, 0, 1/3, 0},
     {1/2, 0, 1/2, 1/3, 0, 0, 0, 0},
     {1/2, 0, 0, 0, 0, 0, 0, 0},
     {0, 1, 0, 0, 0, 0, 0, 0},
     {0, 0, 1/2, 1/3, 0, 0, 1/3, 0},
     {0, 0, 0, 1/3, 1/3, 0, 0, 1/2},
     {0, 0, 0, 0, 1/3, 0, 0, 1/2},
     {0, 0, 0, 0, 1/3, 1, 1/3, 0}};
R = Eigensystem[H];
VR = {R[[1, 1]], R[[2, 1]]}
PageRank = VR[[2]]
{1, {12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59, 1}}
Matlab and Mathematica don't give the same eigenvector for the eigenvalue 1. Both work, though... which one is correct, and why are they different? How do I get all eigenvectors with the eigenvalue 1?
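For reference, the dominant eigenvector can also be obtained without any eigen-solver at all. The sketch below is plain Python (not MATLAB or Mathematica); the matrix is copied from the H printed above. It runs power iteration on H and rescales the result so the last coordinate is 1, which matches Mathematica's exact answer:

```python
# Power-iteration sketch (illustrative; matrix H copied from the question).
H = [
    [0,   0,   0,   0,   0,   0, 1/3,   0],
    [1/2, 0, 1/2, 1/3,   0,   0,   0,   0],
    [1/2, 0,   0,   0,   0,   0,   0,   0],
    [0,   1,   0,   0,   0,   0,   0,   0],
    [0,   0, 1/2, 1/3,   0,   0, 1/3,   0],
    [0,   0,   0, 1/3, 1/3,   0,   0, 1/2],
    [0,   0,   0,   0, 1/3,   0,   0, 1/2],
    [0,   0,   0,   0, 1/3,   1, 1/3,   0],
]

n = len(H)
x = [1.0 / n] * n            # start from the uniform distribution
for _ in range(5000):        # H is column-stochastic, so x stays a distribution
    x = [sum(H[i][j] * x[j] for j in range(n)) for i in range(n)]

scaled = [xi / x[-1] for xi in x]   # rescale so the last coordinate is 1
print([round(v, 4) for v in scaled])
# ≈ [0.2034, 0.2288, 0.1017, 0.2288, 0.3305, 0.6864, 0.6102, 1.0]
```

The rescaled limit agrees with Mathematica's {12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59, 1}.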
3 Answers
An eigenvector X of a matrix A is, by definition, a vector that satisfies AX = kX, where k is a constant (the eigenvalue). It is pretty clear from the definition that cX is also an eigenvector for any c not equal to 0. So there is some constant c such that X_matlab = c*X_mathematica.
It looks like the first vector is normalized to Euclidean length 1 (add up the squares of the coordinates, take the square root, and you get 1), while the second is normalized so that the final coordinate is 1 (some eigenvector was found, and then all coordinates were divided by its final coordinate).
You can use whichever one you want, if all you need is an eigenvector.
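The two normalizations can be checked numerically. A small sketch (plain Python; the numbers are copied from the question) shows that MATLAB's vector has Euclidean length 1, and that dividing it by its last coordinate recovers Mathematica's answer:

```python
import math

# MATLAB's eigs output (rounded printout) and Mathematica's exact eigenvector.
x_matlab = [-0.1400, -0.1576, -0.0700, -0.1576,
            -0.2276, -0.4727, -0.4201, -0.6886]
x_mathematica = [12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59, 1]

# Euclidean norm of MATLAB's vector is 1 (up to the 4-digit rounding).
norm = math.sqrt(sum(v * v for v in x_matlab))
print(round(norm, 3))        # ≈ 1.0

# Dividing by the last coordinate gives Mathematica's normalization.
rescaled = [v / x_matlab[-1] for v in x_matlab]
print(max(abs(a - b) for a, b in zip(rescaled, x_mathematica)) < 1e-3)  # True
```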
This is because if a vector x is an eigenvector of matrix H, so is any multiple of x.
The vector you quote as MATLAB's answer does not quite check out, since the printout is rounded. But assuming it is close enough, you can see that the two answers consist of almost the same elements up to a common factor: Mathematica's vector is about -1.45 times MATLAB's (equivalently, MATLAB's is about -0.69 times Mathematica's).
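The "does not quite check" point can be made concrete: multiplying H by MATLAB's printed vector reproduces it only up to the 4-digit rounding of the printout. A quick illustrative sketch in plain Python (matrix and vector copied from the question):

```python
# Check how closely MATLAB's rounded printout satisfies H x = x.
H = [
    [0,   0,   0,   0,   0,   0, 1/3,   0],
    [1/2, 0, 1/2, 1/3,   0,   0,   0,   0],
    [1/2, 0,   0,   0,   0,   0,   0,   0],
    [0,   1,   0,   0,   0,   0,   0,   0],
    [0,   0, 1/2, 1/3,   0,   0, 1/3,   0],
    [0,   0,   0, 1/3, 1/3,   0,   0, 1/2],
    [0,   0,   0,   0, 1/3,   0,   0, 1/2],
    [0,   0,   0,   0, 1/3,   1, 1/3,   0],
]
x = [-0.1400, -0.1576, -0.0700, -0.1576, -0.2276, -0.4727, -0.4201, -0.6886]

Hx = [sum(H[i][j] * x[j] for j in range(8)) for i in range(8)]
residual = max(abs(a - b) for a, b in zip(Hx, x))
print(residual)   # small but nonzero: the printout is rounded to 4 digits
```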
Eigenvectors are not necessarily unique. All that is required of a set of eigenvectors (as returned by an eigen-solver) is that:
- each v_m has unit norm,
- v_m*v_n = 0 for all m ≠ n (orthogonality), and
- A*v_m = u_m*v_m, where u_m is the corresponding eigenvalue.
The exact eigenvectors returned depend on the algorithm implemented. As a simple example to demonstrate that one matrix can have two different sets of eigenvectors, consider an NxN identity matrix I: it is obvious (and can easily be confirmed) that each column of I is an eigenvector, and the eigenvalues are all 1.
I now claim that the complex-exponential vectors built from 1i, for m = 1, 2, ..., N, also form an orthogonal basis set with norm 1, and hence are eigenvectors of I. Here 1i refers to the square root of -1 in MATLAB notation. You can verify this for yourself. (I've taken the real part because the imaginary part is nonzero, of order 10^-16, due to machine-precision effects, although it should be zero; you can even check this analytically. imagesc returns an error otherwise.)
So, to sum up: eigenvectors are not necessarily unique, and both results convey the same information, just in different representations.