Matlab vs. Mathematica: eigenvectors?

Published 2024-11-01 00:54:59

function H = calcHyperlinkMatrix(M)
    % Build the column-stochastic hyperlink matrix:
    % H(i,j) = 1/outdeg(j) if page j links to page i (M(j,i) == 1), else 0.
    [r, c] = size(M);
    outdeg = sum(M, 2);   % row sums of M: outgoing links per page
    H = zeros(r, c);
    for i = 1:r
        for j = 1:c
            if M(j, i) == 1
                H(i, j) = 1 / outdeg(j);
            end
        end
    end
end

function V = pageRank(M)
    % Eigenvector for the eigenvalue of largest magnitude
    % (which is 1 for a column-stochastic matrix).
    [V, D] = eigs(M, 1);
end

function R = google(links)
    R = pageRank(calcHyperlinkMatrix(links));
end

M = [0 1 1 0 0 0 0 0;
     0 0 0 1 0 0 0 0;
     0 1 0 0 1 0 0 0;
     0 1 0 0 1 1 0 0;
     0 0 0 0 0 1 1 1;
     0 0 0 0 0 0 0 1;
     1 0 0 0 1 0 0 1;
     0 0 0 0 0 1 1 0];
google(M)

ans =

   -0.1400
   -0.1576
   -0.0700
   -0.1576
   -0.2276
   -0.4727
   -0.4201
   -0.6886

Mathematica:

calculateHyperlinkMatrix[linkMatrix_] := Module[{r, c, H, i, j},
  {r, c} = Dimensions[linkMatrix];
  H = Table[0, {r}, {c}];
  For[i = 1, i <= r, i++,
   For[j = 1, j <= c, j++,
    If[linkMatrix[[j, i]] == 1,
     H[[i, j]] = 1/Total[linkMatrix[[j]]]]
    ]
   ];
  H
  ]


H = {{0, 0, 0, 0, 0, 0, 1/3, 0},
     {1/2, 0, 1/2, 1/3, 0, 0, 0, 0},
     {1/2, 0, 0, 0, 0, 0, 0, 0},
     {0, 1, 0, 0, 0, 0, 0, 0},
     {0, 0, 1/2, 1/3, 0, 0, 1/3, 0},
     {0, 0, 0, 1/3, 1/3, 0, 0, 1/2},
     {0, 0, 0, 0, 1/3, 0, 0, 1/2},
     {0, 0, 0, 0, 1/3, 1, 1/3, 0}};
R = Eigensystem[H];
VR = {R[[1, 1]], R[[2, 1]]}
PageRank = VR[[2]]


{1, {12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59, 1}}

MATLAB and Mathematica do not give the same eigenvector for the eigenvalue 1, although both are valid... which one is correct, and why are they different? How do I generate all eigenvectors with the eigenvalue 1?
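To the last part of the question: the eigenvectors with eigenvalue 1 are exactly the nonzero vectors in the null space of H - I. Here is a hedged Python/NumPy sketch of the whole computation (Python standing in for MATLAB/Mathematica; the variable names are mine, the link matrix is copied from the question):

```python
import numpy as np

# Link matrix from the question: M[j, i] == 1 means page j links to page i
M = np.array([
    [0, 1, 1, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 0, 0, 0, 0],
    [0, 1, 0, 0, 1, 0, 0, 0],
    [0, 1, 0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 1, 1, 1],
    [0, 0, 0, 0, 0, 0, 0, 1],
    [1, 0, 0, 0, 1, 0, 0, 1],
    [0, 0, 0, 0, 0, 1, 1, 0],
], dtype=float)

H = M.T / M.sum(axis=1)   # column-stochastic hyperlink matrix

# All eigenvectors with eigenvalue 1 span the null space of (H - I);
# numerically, keep the eigenvector-matrix columns whose eigenvalue is ~1
w, V = np.linalg.eig(H)
ones = np.isclose(w, 1.0)
print(ones.sum())   # 1: here the eigenspace for eigenvalue 1 is one-dimensional

# Fix the arbitrary scale by normalizing to a probability vector
# (entries summing to 1); both tools' answers then coincide
v = np.real(V[:, ones][:, 0])
pagerank = v / v.sum()
print(pagerank)
```

Since this link graph is strongly connected, Perron-Frobenius guarantees the eigenspace for eigenvalue 1 is one-dimensional, so up to scale there is only one answer.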



Comments (3)

熊抱啵儿 2024-11-08 00:54:59


The definition of an eigenvector X is some vector X that satisfies

AX = kX,

where A is a matrix and k is a constant. It is pretty clear from the definition that cX is also an eigenvector for any c not equal to 0. So there is some constant c such that X_matlab = cX_mathematica.

It looks like the first is normalized to Euclidean length 1 (add the squares of the coordinates, take the square root, and you get 1), while the second is normalized so that the final coordinate is 1 (an eigenvector was found and then all coordinates were divided by the final coordinate).

You can use whichever one you want, if all you need is an eigenvector.
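The two normalizations described here are easy to check; a Python/NumPy sketch, with both vectors copied from the question:

```python
import numpy as np

# The eigenvectors reported by MATLAB's eigs and Mathematica's Eigensystem
matlab = np.array([-0.1400, -0.1576, -0.0700, -0.1576,
                   -0.2276, -0.4727, -0.4201, -0.6886])
mathematica = np.array([12/59, 27/118, 6/59, 27/118,
                        39/118, 81/118, 36/59, 1.0])

# MATLAB's vector has Euclidean norm 1 (up to its 4-decimal rounding)
print(np.linalg.norm(matlab))

# Rescale each so its last coordinate is 1: they then agree
print(matlab / matlab[-1])
print(mathematica / mathematica[-1])
```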

只等公子 2024-11-08 00:54:59


This is because if a vector x is an eigenvector of a matrix H, so is any nonzero multiple of x.
The vector you quote as MATLAB's answer does not quite check out:

In[41]:= H.matlab - matlab

Out[41]= {-0.0000333333, 0.0000666667, 0., 0., 0.0000333333, 0., -0.0000666667, 0.}

But assuming it is close enough (the residual comes only from the four-decimal rounding of the printed vector), you can see that

In[43]:= {12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59,
  1}/{-0.1400, -0.1576, -0.0700, -0.1576,
  -0.2276, -0.4727, -0.4201, -0.6886}

Out[43]= {-1.45278, -1.45186, -1.45278, -1.45186, -1.45215, -1.45217, -1.45244, -1.45222}

consists of almost the same elements. Thus MATLAB's vector is a -1.45 multiple of Mathematica's.
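The same residual and ratio checks can be reproduced in Python/NumPy (a sketch; the H matrix and both vectors are copied from the question):

```python
import numpy as np

# Hyperlink matrix and the two reported eigenvectors, from the question
H = np.array([
    [0,   0, 0,   0,   0,   0, 1/3, 0  ],
    [1/2, 0, 1/2, 1/3, 0,   0, 0,   0  ],
    [1/2, 0, 0,   0,   0,   0, 0,   0  ],
    [0,   1, 0,   0,   0,   0, 0,   0  ],
    [0,   0, 1/2, 1/3, 0,   0, 1/3, 0  ],
    [0,   0, 0,   1/3, 1/3, 0, 0,   1/2],
    [0,   0, 0,   0,   1/3, 0, 0,   1/2],
    [0,   0, 0,   0,   1/3, 1, 1/3, 0  ],
])
matlab = np.array([-0.1400, -0.1576, -0.0700, -0.1576,
                   -0.2276, -0.4727, -0.4201, -0.6886])
mathematica = np.array([12/59, 27/118, 6/59, 27/118,
                        39/118, 81/118, 36/59, 1.0])

# Residual is tiny but nonzero, purely because MATLAB's vector is
# printed to only four decimal places
residual = H @ matlab - matlab
print(residual)

# Elementwise ratio is (nearly) constant: the vectors are scalar multiples
ratio = mathematica / matlab
print(ratio)   # all entries close to -1.452
```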

做个少女永远怀春 2024-11-08 00:54:59


Eigenvectors are not necessarily unique. Strictly, all that is required of an eigenvector v_m is that it satisfies Av_m = u_m v_m, where u_m is the corresponding eigenvalue. Beyond that, the sets returned by numerical routines are typically chosen so that

  1. Each vector has unit norm
  2. v_m*v_n = 0 for all m ≠ n (orthogonality, when the matrix admits an orthogonal eigenbasis)

The exact eigenvectors returned depends on the algorithm implemented. As a simple example to demonstrate that one matrix can have two different sets of eigenvectors, consider an NxN identity matrix:

I=   1     0     0     0
     0     1     0     0
    ...   ...   ...   ...
     0     0     0     1

It is obvious (and can be easily confirmed) that each column of I is an eigenvector and the eigenvalues are all 1.

I now state that the following vectors

v_m=[1,exp(2*pi*1i*m/N),...,exp(2*pi*1i*m*(N-1)/N)]';

for m = 1, 2, ..., N form an orthonormal basis, and hence are also eigenvectors of I. Here 1i refers to the square root of -1 in MATLAB notation. You can verify this for yourself:

N=50;
v=1/sqrt(N)*cumprod(repmat(exp(-1i*2*pi/N*(0:N-1)),N,1),1);
imagesc(real(v*v'));

Here I've taken the real part because, due to machine-precision effects, the imaginary part is nonzero (of order 10^-16) when analytically it should be exactly zero. imagesc returns an error otherwise.
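The same orthonormality check, sketched in Python/NumPy (mirroring the MATLAB snippet above; the sign of the exponent differs between the formula and the code, but either choice gives an orthonormal set):

```python
import numpy as np

N = 50
n = np.arange(N)
# v[:, m] = (1/sqrt(N)) * exp(-2j*pi*m*n/N): the m-th DFT-style vector
v = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Gram matrix of inner products v_m . conj(v_n): should be the identity
G = v.conj().T @ v
print(np.abs(G - np.eye(N)).max())   # tiny, of order machine precision

# Each column trivially satisfies eye(N) @ v_m == 1 * v_m, so these N
# vectors are just as valid an eigenbasis for I as the standard basis.
```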

So, to sum up, eigenvectors are not necessarily unique and both convey the same information; just in different representations.
