Shannon's Coding Theorem

a basic theorem of information theory concerning the transmission of signals over communication channels in the presence of noise that causes distortion.

Suppose a sequence of symbols, each appearing with a certain probability, is to be transmitted, and there is some probability that a transmitted symbol will be distorted during transmission. The simplest way of reliably restoring the original sequence from the received sequence is to repeat each transmitted symbol many (N) times. The rate of transmission, however, is thereby reduced by a factor of N, that is, made arbitrarily close to zero as N grows.

According to Shannon's theorem, there exists a positive number v, depending only on the probabilities under consideration, such that for arbitrarily small ε > 0 there are methods of transmission at rates v′ < v arbitrarily close to v that permit restoration of the original sequence with a probability of error less than ε. If the rate of transmission v′ is greater than v, no such methods can be found. The methods of transmission referred to involve the use of "noise-proof" codes. The critical rate v is given by the equation Hv = C, where H is the entropy of the source in bits per symbol and C is the capacity of the channel in bits per second, so that v = C/H.
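As a rough illustration, the following Python sketch simulates the repetition method over a binary symmetric channel with majority-vote decoding and compares the estimated error probability with the shrinking rate 1/N; it then evaluates the critical rate v = C/H for that channel. The particular channel (crossover probability 0.1), the equiprobable binary source, and the measurement of rate in symbols per channel use rather than per second are assumptions introduced for the example and are not part of the article.

```python
import random
from math import log2

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def simulate_repetition(p_flip, n_repeat, trials=100_000):
    """Estimate the error probability of an N-fold repetition code
    with majority-vote decoding over a binary symmetric channel."""
    errors = 0
    for _ in range(trials):
        # Transmit the symbol 0, repeated n_repeat times; count the flips.
        flips = sum(random.random() < p_flip for _ in range(n_repeat))
        if flips > n_repeat // 2:      # majority vote decodes incorrectly
            errors += 1
    return errors / trials

if __name__ == "__main__":
    p = 0.1                            # assumed crossover probability of the channel
    for n in (1, 3, 5, 11, 21):
        err = simulate_repetition(p, n)
        print(f"N = {n:2d}: rate = {1/n:.3f} symbols/use, error ~ {err:.4f}")

    # Critical rate from H * v = C for a binary symmetric channel:
    # capacity C = 1 - H_b(p) bits per channel use, source entropy H bits per symbol.
    H = 1.0                            # equiprobable binary source (assumed)
    C = 1 - binary_entropy(p)
    print(f"critical rate v = C / H = {C / H:.3f} symbols per channel use")
```

Running the sketch shows the error probability falling as N increases while the rate 1/N tends to zero, which is precisely the trade-off that the theorem shows can be avoided at any rate below the critical rate v.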