Norms and Distance of Vectors

Recall that the ‘norm’ of a vector \(v\), denoted \(||v||\), is simply its length. For a vector with components

\[v = \left(v_1,...,v_n\right)\]

the norm of \(v\) is given by:

\[||v|| = \sqrt{v_1^2+...+v_n^2}\]

The distance between two vectors is the length of their difference:

\[d(v,w) = ||v-w||\]

import numpy as np
from scipy import linalg


# norm of a vector

v = np.array([1,2])
linalg.norm(v)
2.2361
# distance between two vectors

w = np.array([1,1])
linalg.norm(v-w)
1.0000

Inner Products

Inner products are closely related to norms and distances. The (standard) inner product of two \(n\)-dimensional vectors \(v\) and \(w\) is given by:

\[\langle v,w \rangle = v_1w_1+...+v_nw_n\]

That is, the inner product is just the sum of the products of the corresponding components. Certain ‘special’ matrices also define inner products, and we will see some of those later.

Any inner product determines a norm via:

\[||v|| = \langle v,v \rangle^{\frac{1}{2}}\]
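We can verify this relationship numerically with the vector from above (a quick check added here, not part of the original example):

np.sqrt(v.dot(v))
2.2361
linalg.norm(v)
2.2361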

There is a more abstract formulation of an inner product that is useful when considering more general vector spaces, especially vector spaces of functions:

An inner product on a vector space \(V\) is a symmetric, positive definite, bilinear form.

There is also a more abstract definition of a norm: a norm is a function from a vector space to the real numbers that is positive definite, absolutely scalable, and satisfies the triangle inequality.
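As a quick illustration (an added check, using the vectors defined above), the triangle inequality is easy to verify numerically:

# triangle inequality: ||v + w|| <= ||v|| + ||w||
linalg.norm(v + w) <= linalg.norm(v) + linalg.norm(w)
True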

What is important here is not to memorize these definitions - just to realize that ‘norm’ and ‘inner product’ can be defined for things that are not tuples in \(\mathbb{R}^n\). (In particular, they can be defined on vector spaces of functions).
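For example, a standard inner product on functions on \([0, 1]\) is \(\langle f,g \rangle = \int_0^1 f(x)g(x)\,dx\). Here is a minimal numerical sketch (the functions \(f(x)=x\) and \(g(x)=x^2\) are chosen purely for illustration):

from scipy import integrate

# inner product of f(x) = x and g(x) = x**2 on [0, 1]:
# the integral of x**3 from 0 to 1, which is exactly 1/4
integrate.quad(lambda x: x * x**2, 0, 1)[0]
0.2500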

Example

v.dot(w)
3
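The same value can be computed in several equivalent ways (a minor addition):

np.inner(v, w)
3
np.sum(v * w)
3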

Outer Products

Note that the inner product is just matrix multiplication of a \(1\times n\) vector with an \(n\times 1\) vector. In fact, we may write:

\[\langle v,w \rangle = v^Tw\]

The outer product of two vectors is just the opposite: the matrix multiplication of an \(n\times 1\) vector with a \(1\times n\) vector. It is given by:

\[v\otimes w = vw^T\]

Note that I am considering \(v\) and \(w\) as column vectors. The result of the inner product is a scalar. The result of the outer product is an \(n\times n\) matrix.

Example

np.outer(v,w)
array([[1, 1],
       [2, 2]])
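To make the connection with matrix multiplication explicit (a small added sketch), we can reshape \(v\) into an \(n\times 1\) column and \(w\) into a \(1\times n\) row:

# column vector times row vector gives the outer product
v.reshape(-1, 1).dot(w.reshape(1, -1))
array([[1, 1],
       [2, 2]])
# row vector times column vector gives the inner product
v.reshape(1, -1).dot(w.reshape(-1, 1))
array([[3]])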

Extended example: the covariance matrix is an outer product.

import numpy as np

# We have n observations of p variables
n, p = 10, 4
v = np.random.random((p,n))
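# (the exact values shown below depend on the random draw)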
# The covariance matrix is a p by p matrix
np.cov(v)
array([[ 0.1055, -0.0437,  0.0352, -0.0152],
       [-0.0437,  0.055 , -0.0126,  0.0324],
       [ 0.0352, -0.0126,  0.1016,  0.0552],
       [-0.0152,  0.0324,  0.0552,  0.1224]])
# From the definition, the covariance matrix
# is just the outer product of the centered
# data matrix (every variable has zero mean)
# divided by the number of degrees of freedom
w = v - v.mean(1)[:, np.newaxis]
w.dot(w.T)/(n - 1)
array([[ 0.1055, -0.0437,  0.0352, -0.0152],
       [-0.0437,  0.055 , -0.0126,  0.0324],
       [ 0.0352, -0.0126,  0.1016,  0.0552],
       [-0.0152,  0.0324,  0.0552,  0.1224]])
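Equivalently (an added sketch), the same matrix is the sum of the outer products of the individual centered observations, divided by the degrees of freedom:

# each column of w is one centered observation; summing their
# outer products reproduces w.dot(w.T) (up to floating point)
sum(np.outer(w[:, i], w[:, i]) for i in range(n)) / (n - 1)
array([[ 0.1055, -0.0437,  0.0352, -0.0152],
       [-0.0437,  0.055 , -0.0126,  0.0324],
       [ 0.0352, -0.0126,  0.1016,  0.0552],
       [-0.0152,  0.0324,  0.0552,  0.1224]])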
