Gauss-Jordan elimination

[¦gau̇s ¦jȯrd·ən ə‚lim·ə′nā·shən]
(mathematics)
A procedure for solving a system of linear equations in which elementary row operations are applied to the augmented matrix of the system until it is in reduced row echelon form, so that each unknown is eliminated from every equation except one and the solution can be read off directly.
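As an illustration of the procedure (a minimal sketch, not drawn from any of the sources cited below; the function name gauss_jordan and the example system are hypothetical), the following Python code reduces an augmented matrix [A | b] to reduced row echelon form, using partial pivoting for numerical stability:

    def gauss_jordan(aug):
        """Reduce an augmented matrix [A | b] to reduced row echelon form in place."""
        rows, cols = len(aug), len(aug[0])
        pivot_row = 0
        for col in range(cols - 1):          # last column is the right-hand side b
            # Partial pivoting: pick the row with the largest entry (in absolute value) in this column.
            best = max(range(pivot_row, rows), key=lambda r: abs(aug[r][col]))
            if abs(aug[best][col]) < 1e-12:   # no usable pivot in this column
                continue
            aug[pivot_row], aug[best] = aug[best], aug[pivot_row]
            # Scale the pivot row so the pivot becomes 1.
            p = aug[pivot_row][col]
            aug[pivot_row] = [x / p for x in aug[pivot_row]]
            # Eliminate the pivot column from every other row (above and below).
            for r in range(rows):
                if r != pivot_row and aug[r][col] != 0:
                    factor = aug[r][col]
                    aug[r] = [x - factor * y for x, y in zip(aug[r], aug[pivot_row])]
            pivot_row += 1
            if pivot_row == rows:
                break
        return aug

    # Example: solve x + 2y = 5, 3x + 4y = 6
    print(gauss_jordan([[1.0, 2.0, 5.0], [3.0, 4.0, 6.0]]))

Running the example prints the reduced matrix [[1, 0, -4], [0, 1, 4.5]], i.e. x = -4, y = 4.5.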
References in periodicals archive
Third, we present a parallel system for solving linear equations in GF((2^4)^2) based on Gauss-Jordan elimination, building on the work in [28].
The difference from the usual Gauss-Jordan elimination is that the standard algorithm chooses the pivot after the elimination, whereas we perform the pivoting during the elimination.
Third, we present a parallel system for solving linear equations in GF((2^4)^2) based on Gauss-Jordan elimination. With further minor optimizations and by integrating the major improvement above, we implement our design in GF((2^4)^2) on TSMC 0.18 µm standard-cell CMOS ASICs.
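The two excerpts above apply Gauss-Jordan elimination over the finite field GF((2^4)^2). As a much simpler illustration of the same idea (a hypothetical sketch, not the cited design: it works over GF(2) rather than GF((2^4)^2), so field arithmetic collapses to XOR and every nonzero pivot is already 1):

    def gauss_jordan_gf2(aug):
        """Reduce an augmented 0/1 matrix [A | b] over GF(2) to reduced row echelon form.
        Row addition and subtraction both become XOR, and no pivot scaling is needed."""
        rows, cols = len(aug), len(aug[0])
        pivot_row = 0
        for col in range(cols - 1):
            # Find a row with a 1 in this column, at or below pivot_row.
            pivot = next((r for r in range(pivot_row, rows) if aug[r][col]), None)
            if pivot is None:
                continue
            aug[pivot_row], aug[pivot] = aug[pivot], aug[pivot_row]
            # XOR the pivot row into every other row that has a 1 in this column.
            for r in range(rows):
                if r != pivot_row and aug[r][col]:
                    aug[r] = [a ^ b for a, b in zip(aug[r], aug[pivot_row])]
            pivot_row += 1
            if pivot_row == rows:
                break
        return aug

    # Example over GF(2): x + y = 1, y + z = 0, x + y + z = 1  ->  x = 1, y = 0, z = 0
    print(gauss_jordan_gf2([[1, 1, 0, 1], [0, 1, 1, 0], [1, 1, 1, 1]]))

Over GF((2^4)^2) the same structure applies, but row scaling and elimination use that field's multiplication and inversion rather than bitwise XOR alone.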
This module is a hardware version of the Gauss-Jordan elimination procedure, which has been parallelized and designed to rapidly solve the nodal system, almost independently of reconfigurations in the network.
The best-known direct methods are Cramer's rule, Gaussian elimination, Gauss-Jordan elimination, LU factorization, QR decomposition, etc.
The decoding process in [4] is based on the well-known Gauss-Jordan elimination algorithm, but it can progressively decode data on the arrival of each partial data block.
Gauss-Jordan elimination produces an array containing only three unknowns.
This edition has two new appendices, a graphic calculator guide and an Excel guide; references to specific calculator and Excel steps in the appendices each time a new technology process is introduced; streamlined exposition and example discussions; a rewritten section on fitting curves to data with graphic utilities; a reorganized section on Gauss-Jordan elimination; updated and replaced real-data examples and exercises; revised and reorganized drill exercises; additional multistep applications; and added images and illustrations.
Let us also recall that there are problems where the conditioning of the lower and upper triangular matrices can have different importance: this happens, for instance, in the backward stability of Gauss-Jordan elimination, where the conditioning of the upper triangular matrix is crucial (see [10], [13] and [14]).
If we compute T^-1 by a procedure similar to Gauss-Jordan elimination, but using elementary column operations instead of elementary row operations and starting from the last row, we can easily obtain the following bound for the absolute value of (T^-1)_ij for any i ∈ {1, ..., n} and i ≥ j:
PEÑA, Simultaneous backward stability of Gauss and Gauss-Jordan elimination, Numer.
Given a square matrix A, the Gauss-Jordan routines compute the inverse matrix of A, A^-1, via the Gauss-Jordan elimination algorithm with partial pivoting [Golub and van Loan 1989; Wilkinson 1961].
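To make the last excerpt concrete, here is a minimal Python sketch of that idea (an illustration only, not the routine described in the cited references): it augments A with the identity matrix and runs Gauss-Jordan elimination with partial pivoting, so the right-hand block ends up holding A^-1.

    def gauss_jordan_inverse(A, eps=1e-12):
        """Invert a square matrix by reducing [A | I] to [I | A^-1] with partial pivoting."""
        n = len(A)
        # Augment A with the identity matrix.
        aug = [list(map(float, A[i])) + [1.0 if j == i else 0.0 for j in range(n)]
               for i in range(n)]
        for col in range(n):
            # Partial pivoting: swap in the row with the largest entry (in absolute value) in this column.
            best = max(range(col, n), key=lambda r: abs(aug[r][col]))
            if abs(aug[best][col]) < eps:
                raise ValueError("matrix is singular to working precision")
            aug[col], aug[best] = aug[best], aug[col]
            # Normalize the pivot row, then clear the pivot column in all other rows.
            p = aug[col][col]
            aug[col] = [x / p for x in aug[col]]
            for r in range(n):
                if r != col and aug[r][col] != 0:
                    factor = aug[r][col]
                    aug[r] = [x - factor * y for x, y in zip(aug[r], aug[col])]
        # The right half of the fully reduced block is A^-1.
        return [row[n:] for row in aug]

    # Example: the inverse of [[4, 7], [2, 6]] is [[0.6, -0.7], [-0.2, 0.4]]
    print(gauss_jordan_inverse([[4, 7], [2, 6]]))

For the 2 x 2 example the exact inverse is [[0.6, -0.7], [-0.2, 0.4]]; the printed values may differ only by floating-point rounding.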