

A nonzero vector v whose direction is not changed by a given linear transformation T ; that is, T (v) = λ v for some scalar λ. Also known as characteristic vector.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
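The defining relation T(v) = λv can be checked numerically. This is a minimal sketch, not part of the dictionary entry; the 2×2 matrix A is our own illustrative example.

```python
import numpy as np

# Minimal numerical check of the definition T(v) = λ v.  The matrix A
# here is our own illustrative example, not part of the dictionary entry.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

# Each returned pair satisfies A v = λ v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```

For this particular matrix, the eigenvalues returned are 2 and 5 (roots of λ² − 7λ + 10).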
The following article is from The Great Soviet Encyclopedia (1979). It might be outdated or ideologically biased.



Eigenvector (or characteristic vector). An eigenvector of a linear transformation is a vector whose direction is unchanged by the transformation, so that the transformation simply multiplies it by a scalar. For example, the eigenvectors of a transformation composed of a rotation about some axis and a contraction toward the plane perpendicular to that axis are the vectors directed along the axis.

The coordinates x1, x2, . . ., xn of the eigenvectors of a transformation of n-dimensional space with the matrix ‖aik‖ satisfy the system of homogeneous linear equations

ai1x1 + ai2x2 + . . . + ainxn = λxi,   i = 1, 2, . . ., n

where λ is an eigenvalue of the matrix. If the matrix of the transformation is Hermitian, then eigenvectors corresponding to distinct eigenvalues are mutually perpendicular. Under a Hermitian transformation, a sphere becomes an ellipsoid whose principal axes are directed along eigenvectors of the transformation.
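The Hermitian case above can be sketched numerically; the matrix H is our own example, not from the encyclopedia entry.

```python
import numpy as np

# Sketch of the Hermitian case described above; the matrix H is our own
# example.  Eigenvectors of a Hermitian matrix are mutually perpendicular.
H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(H, H.conj().T)              # H is Hermitian

eigvals, V = np.linalg.eigh(H)                 # columns of V are eigenvectors
assert np.allclose(V.conj().T @ V, np.eye(2))  # mutual perpendicularity
```

Here `np.linalg.eigh` is the NumPy routine specialized to Hermitian matrices; for this H it returns the eigenvalues 1 and 4.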

The Great Soviet Encyclopedia, 3rd Edition (1970-1979). © 2010 The Gale Group, Inc. All rights reserved.


A vector which, when acted on by a particular linear transformation, produces a scalar multiple of the original vector. The scalar in question is called the eigenvalue corresponding to this eigenvector.

It should be noted that "vector" here means "element of a vector space" which can include many mathematical entities. Ordinary vectors are elements of a vector space, and multiplication by a matrix is a linear transformation on them; smooth functions "are vectors", and many partial differential operators are linear transformations on the space of such functions; quantum-mechanical states "are vectors", and observables are linear transformations on the state space.

An important theorem says, roughly, that certain linear transformations have enough eigenvectors to form a basis of the whole vector space. This is why Fourier analysis works, and why in quantum mechanics every state is a superposition of eigenstates of observables.
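The eigenbasis claim can be sketched numerically; the matrix S and vector x below are our own illustrative choices.

```python
import numpy as np

# Numerical sketch of the eigenbasis claim above: a real symmetric matrix
# has an orthonormal eigenbasis, so any vector is a superposition of its
# eigenvectors.  S and x are our own illustrative choices.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, V = np.linalg.eigh(S)           # orthonormal eigenbasis as columns
x = np.array([3.0, -1.0])
coeffs = V.T @ x                   # coordinates of x in the eigenbasis
assert np.allclose(V @ coeffs, x)  # x recovered from its eigen-expansion
```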

An eigenvector is (a representative member of) a fixed point of the map on projective space induced by a linear map.
This article is provided by FOLDOC - Free Online Dictionary of Computing (foldoc.org)
References in periodicals archive
R_A and R_B are the transmit correlation matrices for users A and B, respectively, which can be decomposed as R_k = V_k Λ_k V_k^H, k = A, B, where V_k ∈ C^{M×M} is a unitary matrix whose columns are eigenvectors of R_k, and the diagonal Λ_k containing the eigenvalues of R_k is normalized so that Tr(Λ_k) = M.
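The decomposition quoted above can be sketched as follows; this is a hedged illustration in which R is a synthetic Hermitian matrix, not data from the cited paper.

```python
import numpy as np

# Hedged sketch of the decomposition quoted above, R = V Λ V^H with
# Tr(Λ) = M; R below is a synthetic Hermitian matrix, not data from the
# cited paper.
M = 4
rng = np.random.default_rng(0)
X = rng.standard_normal((M, 100)) + 1j * rng.standard_normal((M, 100))
R = X @ X.conj().T
R *= M / np.trace(R).real                     # normalize so Tr(R) = M

eigvals, V = np.linalg.eigh(R)                # V unitary; columns are eigenvectors
Lam = np.diag(eigvals)
assert np.allclose(V @ Lam @ V.conj().T, R)   # R = V Λ V^H
assert np.isclose(np.trace(Lam).real, M)      # Tr(Λ) = M
```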
More precisely, they are the eigenvectors associated with the zero eigenvalues of (4.3) on floating subdomains.
Let V be the matrix whose columns are the eigenvectors (V_n) of L^{-1}, and let D be the diagonal matrix with the eigenvalues (λ_n) of L^{-1} on its diagonal.
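This construction can be sketched with our own 2×2 example (the matrix A below is illustrative, not the operator from the cited work): with V holding eigenvectors as columns and D the eigenvalues on its diagonal, A V = V D, i.e. A = V D V^{-1} when V is invertible.

```python
import numpy as np

# Sketch of the V-and-D construction above with our own 2x2 example:
# V holds eigenvectors as columns and D the eigenvalues on its diagonal,
# so A V = V D, i.e. A = V D V^{-1} when V is invertible.
A = np.array([[4.0, 1.0],
              [0.0, 2.0]])
eigvals, V = np.linalg.eig(A)
D = np.diag(eigvals)
assert np.allclose(A @ V, V @ D)
assert np.allclose(V @ D @ np.linalg.inv(V), A)
```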
It is well known that if A is a Riesz-spectral operator, then it can be represented as an infinite sum over all its eigenvectors. However, as stated in Section 1, for the nonautonomous system (1) we assume that D = D(A(t)) is independent of t.
When the eigenvalues of G are arranged from largest to smallest, the orthonormal eigenvectors corresponding to the first d eigenvalues are as follows:
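Selecting the leading eigenvectors as described above can be sketched like this; the matrix G and the choice d = 2 are our own illustration.

```python
import numpy as np

# Sketch of selecting the leading eigenvectors, as described above;
# the matrix G and the choice d = 2 are our own illustration.
G = np.array([[3.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])
d = 2
eigvals, V = np.linalg.eigh(G)        # eigh returns ascending order
order = np.argsort(eigvals)[::-1]     # rearrange from large to small
top = V[:, order[:d]]                 # eigenvectors of the first d eigenvalues
assert np.allclose(top.T @ top, np.eye(d))   # orthonormal columns
```

For this G the two leading eigenvalues are 4 and 2 (from the 2×2 block), with the third eigenvalue 1 discarded.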
Determine the eigenvectors with nonnegative components for the eigenvalues λ_2 = 0 and λ_3 = 3.
The variances can be equalized by using a whitening transformation P within the space spanned by the eigenvectors in U_0, such that P equals
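Variance equalization by whitening can be sketched as follows; this is a hedged illustration in which the covariance C is a synthetic example, not the U_0 of the cited paper.

```python
import numpy as np

# Hedged sketch of variance equalization by a whitening transformation
# (the covariance C is a synthetic example): with C = U Λ U^T, the matrix
# P = Λ^{-1/2} U^T satisfies P C P^T = I.
C = np.array([[4.0, 2.0],
              [2.0, 3.0]])
eigvals, U = np.linalg.eigh(C)
P = np.diag(eigvals ** -0.5) @ U.T
assert np.allclose(P @ C @ P.T, np.eye(2))   # variances equalized
```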
The classification results of the control group and treatment group were determined using the first eigenvector of the LDA algorithm and the MPA and MNA feature values.
In the |t⟩ representation, the energy operator becomes iħ(∂/∂t), while its eigenvectors |E⟩ become e^{Et/(iħ)}, for every E ∈ ℝ.
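The eigenvalue equation stated above can be verified symbolically; this is our own sketch using SymPy, not code from the cited work.

```python
import sympy as sp

# Symbolic check (our own sketch, using SymPy) that e^{Et/(i hbar)} is an
# eigenvector of the energy operator i hbar d/dt with eigenvalue E.
t, E, hbar = sp.symbols('t E hbar', real=True)
psi = sp.exp(E * t / (sp.I * hbar))
lhs = sp.I * hbar * sp.diff(psi, t)          # apply the energy operator
assert sp.simplify(lhs - E * psi) == 0       # eigenvalue equation holds
```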
These properties are the ranks of the matrices A, B, and C and the eigenvectors of these matrices that correspond to the eigenvalue zero.