Orthogonality


orthogonality

[ȯr‚thäg·ə′nal·əd·ē]
(mathematics)
Two geometric objects have this property if they are perpendicular.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
The following article is from The Great Soviet Encyclopedia (1979). It might be outdated or ideologically biased.

Orthogonality

a generalization and often a synonym of the concept of perpendicularity. If two vectors in three-dimensional space are perpendicular, their scalar product is equal to zero. This fact permits us to generalize the concept of perpendicularity by extending it to vectors in any linear space in which a scalar product with the usual properties is defined. Thus two vectors are said to be orthogonal if their scalar product is equal to zero. In particular, let us define the scalar product in the space of complex-valued functions on the interval [a, b] by the formula

(f, ϕ)ρ = ∫ₐᵇ f(x)ϕ̄(x)ρ(x) dx

where ρ(x) ≥ 0. Then, if (f, ϕ)ρ = 0, that is,

∫ₐᵇ f(x)ϕ̄(x)ρ(x) dx = 0

f(x) and ϕ(x) are said to be orthogonal with respect to the weight function ρ(x). Two linear subspaces are termed orthogonal if every vector in one of them is orthogonal to every vector in the other. This concept generalizes the concept of the perpendicularity of two lines or of a line and a plane in three-dimensional space but not the concept of the perpendicularity of two planes. Curves that intersect at right angles, as measured by the angle between the tangents at the point of intersection, are called orthogonal curves.
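In the real-valued case, the definitions above can be checked numerically. The sketch below is illustrative only: the helper names `dot` and `inner` and the midpoint-rule discretization are assumptions, not part of the article, and the complex-valued case would additionally conjugate the second factor.

```python
import math

def dot(u, v):
    # Scalar product of two real vectors.
    return sum(a * b for a, b in zip(u, v))

def inner(f, g, rho, a, b, n=100_000):
    # Midpoint-rule approximation of the weighted scalar product
    # (f, g)_rho = ∫_a^b f(x) g(x) rho(x) dx  (real-valued functions).
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h) * rho(a + (k + 0.5) * h)
                   for k in range(n))

# Vectors are orthogonal when their scalar product is zero.
u, v = (1.0, 2.0, 2.0), (2.0, 1.0, -2.0)
print(dot(u, v))  # 0.0

# sin and cos are orthogonal on [0, 2π] with respect to the weight ρ(x) = 1.
print(abs(inner(math.sin, math.cos, lambda x: 1.0, 0.0, 2 * math.pi)) < 1e-6)  # True

# Two subspaces are orthogonal when every basis vector of one is orthogonal
# to every basis vector of the other: here the x-axis and the yz-plane.
xaxis = [(1.0, 0.0, 0.0)]
yz_plane = [(0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
print(all(abs(dot(p, q)) < 1e-12 for p in xaxis for q in yz_plane))  # True
```

Note that the x-axis and the yz-plane illustrate why orthogonal subspaces generalize a line perpendicular to a plane but not two perpendicular planes: two planes in three-dimensional space always share a line, so they cannot be orthogonal as subspaces.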

The Great Soviet Encyclopedia, 3rd Edition (1970-1979). © 2010 The Gale Group, Inc. All rights reserved.