redundancy bit

[ri′dən·dən·sē ‚bit]
(computer science)
A bit that carries no information itself but is added to the information-carrying bits of a character or stream of characters so that their accuracy can be checked.
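The simplest redundancy bit is a parity bit. As a minimal illustrative sketch (not part of the dictionary definition), the following appends an even-parity bit to a list of bits and later checks it; any single flipped bit makes the check fail:

```python
def add_parity_bit(bits):
    """Append an even-parity redundancy bit so the total count of 1s is even."""
    parity = sum(bits) % 2
    return bits + [parity]

def parity_ok(bits_with_parity):
    """Return True if the even-parity check passes (no odd number of bit flips)."""
    return sum(bits_with_parity) % 2 == 0
```

For example, `add_parity_bit([1, 0, 1, 1])` yields `[1, 0, 1, 1, 1]`, which passes `parity_ok`; flipping any single bit of that word makes `parity_ok` return `False`. Note that parity can only detect an odd number of errors, not correct them.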
References in periodicals archive
The reason is that BPSK, carrying one bit per symbol with fewer redundancy bits, needs more power than QPSK, which carries two bits per symbol with more redundancy bits.
In general, a CRC can detect all burst errors of length no greater than the number of redundancy bits. However, CRCs (polynomial codes) require significant processing time to compute the check function y = f(m), where m is the message data, during both encoding and decoding.
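The check function y = f(m) mentioned above is a polynomial division of the message by a generator polynomial. As a hedged sketch, here is a bitwise CRC-8 (the standard CRC-8/SMBUS variant: polynomial 0x07, initial value 0x00); with 8 redundancy bits it detects all burst errors of length 8 or less:

```python
def crc8(data: bytes, poly: int = 0x07, init: int = 0x00) -> int:
    """Compute CRC-8 (poly 0x07, no reflection, xorout 0) over data, bit by bit."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                # Top bit set: shift left and divide by the generator polynomial.
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc
```

Because this variant uses no reflection and a zero xorout, appending the CRC byte to the message and recomputing yields zero, which is how a decoder verifies the received stream: `crc8(msg + bytes([crc8(msg)])) == 0`.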
Finally, convolutional coding adds redundancy bits to the bit stream so that a decoder can detect errors in the stream and correct them.
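Error *correction*, as opposed to mere detection, requires more redundancy than a parity bit or CRC provides. A full convolutional encoder/Viterbi decoder is beyond a short sketch, so the idea is illustrated here (as an assumption-laden stand-in, not the codes used in the quoted sources) with the simplest forward-error-correcting code: a triple-repetition code, where two of every three transmitted bits are redundancy bits and majority voting corrects any single flipped bit per group:

```python
def encode_repetition(bits, n=3):
    """Repeat each bit n times; the n-1 extra copies are redundancy bits."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(coded, n=3):
    """Majority-vote each group of n bits; corrects up to (n-1)//2 flips per group."""
    decoded = []
    for i in range(0, len(coded), n):
        group = coded[i:i + n]
        decoded.append(1 if sum(group) > n // 2 else 0)
    return decoded
```

For example, `encode_repetition([1, 0])` gives `[1, 1, 1, 0, 0, 0]`; if the channel flips one bit in each group, the decoder still recovers `[1, 0]`. Practical convolutional codes achieve the same correcting power with far less redundancy, at the cost of a more complex decoder.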