
Eigenvalue (quantum mechanics)

If an equation containing a variable parameter possesses nontrivial solutions only for certain special values of the parameter, these solutions are called eigenfunctions and the special values are called eigenvalues.

The eigenfunction-eigenvalue relation is of particular importance in quantum mechanics because of its prominence in the equations that relate the mathematical formalism of the theory to physical results. See Quantum mechanics.



(or characteristic value). An eigenvalue of a linear transformation or operator A is a number λ for which there exists a nonzero vector x such that Ax = λx; the vector x is called an eigenvector, or characteristic vector. Thus, the eigenvalues of a differential operator L(y) with given boundary conditions are the numbers λ for which the equation L(y) = λy has a nonzero solution that satisfies the boundary conditions. For example, if the operator L(y) has the form −y″, then the numbers λn = n², where n is a natural number, are eigenvalues of the operator under the boundary conditions y(0) = y(π) = 0, since the functions yn = sin nx satisfy the equation −y″n = n²yn with the indicated boundary conditions. If, however, λ ≠ n² for any natural number n, then only the function y(x) = 0 satisfies the equation −y″ = λy under the same boundary conditions. Eigenvalues of linear operators are of importance in many problems in mathematics, mechanics, and physics, for example, in analytic geometry, algebra, the theory of vibrations, and quantum mechanics.
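The differential-operator example above can be checked numerically. The following is a minimal sketch, not part of the original article: it approximates L(y) = −y″ with boundary conditions y(0) = y(π) = 0 by a finite-difference matrix and confirms that the smallest eigenvalues are close to n² (the grid size N is an illustrative choice).

```python
import numpy as np

# Approximate L(y) = -y'' on [0, pi] with y(0) = y(pi) = 0 using
# the standard second-order finite-difference stencil.
N = 200                              # number of interior grid points
h = np.pi / (N + 1)                  # grid spacing

# Tridiagonal matrix representing -y'' with Dirichlet boundary conditions
main = 2.0 * np.ones(N) / h**2
off = -1.0 * np.ones(N - 1) / h**2
L = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# The matrix is symmetric, so eigvalsh applies; sort ascending.
eigvals = np.sort(np.linalg.eigvalsh(L))

# The smallest eigenvalues approximate n^2 for n = 1, 2, 3, ...
print(eigvals[:3])  # approximately [1, 4, 9]
```

The corresponding eigenvectors of the matrix sample the eigenfunctions yn = sin nx at the grid points; refining the grid (larger N) makes the approximation sharper.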

The eigenvalues of the matrix A = ║aik║, where i, k = 1, 2, . . . , n, are the eigenvalues of the linear transformation on an n-dimensional complex space that corresponds to A. The eigenvalues can also be defined as the roots of the equation det(A − λE) = 0, where E is the unit matrix, that is, the roots of the equation

│a11 − λ    a12        . . .    a1n    │
│a21        a22 − λ    . . .    a2n    │
│. . . . . . . . . . . . . . . . . . . │ = 0     (*)
│an1        an2        . . .    ann − λ│

which is called the characteristic equation of the matrix. Since these numbers are the same for the similar matrices A and B−1AB, where B is a nonsingular matrix, they characterize properties of the linear transformation that are independent of the choice of coordinate system. To each root λi of equation (*) there corresponds a vector xi ≠ 0 (an eigenvector) such that Axi = λixi. If all the eigenvalues are distinct, then the set of eigenvectors may be chosen as a basis of the vector space. With respect to this basis, the linear transformation is described by the diagonal matrix

Λ = diag(λ1, λ2, . . . , λn)
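The equivalence of the two definitions, eigenvalues of the transformation and roots of the characteristic equation, can be illustrated with a small NumPy sketch (not part of the original article; the 2 × 2 matrix is an arbitrary example):

```python
import numpy as np

# An example matrix A; its characteristic equation is
# det(A - lambda*E) = lambda^2 - 4*lambda + 3 = 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly(A) returns the coefficients of the characteristic polynomial.
char_poly = np.poly(A)
roots = np.sort(np.roots(char_poly))

# Eigenvalues computed directly from the matrix.
eigvals = np.sort(np.linalg.eigvals(A))

print(roots)    # [1. 3.]
print(eigvals)  # [1. 3.]
```

In practice the characteristic polynomial is not used for numerical computation (its roots are ill-conditioned for large n); library routines such as `np.linalg.eigvals` work directly on the matrix.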

Every matrix A with distinct eigenvalues can be represented in the form C−1ΛC. If A is a Hermitian matrix, then its eigenvalues are real, the eigenvectors are orthogonal, and C can be chosen to be a unitary matrix. The absolute value of every eigenvalue of a unitary matrix is equal to 1. The sum of the eigenvalues of a matrix is equal to the sum of its diagonal elements, that is, to the trace of the matrix. Knowledge of the eigenvalues of a matrix plays an important role in the investigation of the convergence of certain approximate methods of solving systems of linear equations.
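The properties listed above can be verified numerically. The following is a minimal sketch (not from the original article; the Hermitian matrix H is an arbitrary example) checking real eigenvalues, unitary diagonalization, the trace identity, and the modulus-1 property of a unitary matrix's eigenvalues:

```python
import numpy as np

# An example Hermitian matrix (equal to its conjugate transpose).
H = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])

# eigh is specialized for Hermitian matrices: w is real, and the
# columns of C are orthonormal eigenvectors, so C is unitary.
w, C = np.linalg.eigh(H)

# Trace equals the sum of the eigenvalues.
print(np.isclose(np.trace(H).real, w.sum()))            # True

# C is unitary: C^H C = E.
print(np.allclose(C.conj().T @ C, np.eye(2)))           # True

# Unitary diagonalization: H = C Lambda C^H.
print(np.allclose(C @ np.diag(w) @ C.conj().T, H))      # True

# Every eigenvalue of a unitary matrix has absolute value 1.
print(np.allclose(np.abs(np.linalg.eigvals(C)), 1.0))   # True
```

Note that `np.linalg.eigh` returns the diagonalization in the form H = CΛC^H, with the unitary factor on the left; this matches the C−1ΛC form above after relabeling, since C^H = C−1 for a unitary matrix.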


One of the scalars λ such that T(v) = λv, where T is a linear operator on a vector space and v is an eigenvector. Also known as characteristic number; characteristic root; characteristic value; latent root; proper value.


The factor by which a linear transformation multiplies one of its eigenvectors.