Binary Digit



binary digit

[′bīn·ə·rē ′dij·ət]
(computer science)
bit

Binary Digit

 

(in information theory), a unit used to measure entropy and the quantity of information. A source with two equiprobable messages has an entropy of 1 binary digit (1 bit). The term derives from the fact that the number of binary digits determines, to within 1, the average number of characters required to record messages from a given source in binary code. Decimal digits (decits) are also used. Converting from one unit to the other corresponds to changing the base of the logarithms used in defining entropy and the quantity of information (base 10 instead of base 2). The conversion formula is 1 decit = 1/log₁₀ 2 bits ≈ 3.32 bits.
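The change of unit is simply a change of logarithm base, which a short Python sketch can illustrate; the function names below are illustrative only and not part of any standard library.

import math

def entropy_bits(probabilities):
    # Shannon entropy in binary digits (bits): logarithms taken to base 2.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def bits_to_decits(bits):
    # Changing the unit corresponds to changing the logarithm base from 2 to 10,
    # so 1 decit = 1 / log10(2) bits, which is about 3.32 bits.
    return bits * math.log10(2)

# A source with two equiprobable messages has an entropy of exactly 1 bit.
h = entropy_bits([0.5, 0.5])
print(h)                    # 1.0 bit
print(bits_to_decits(h))    # about 0.301 decits
print(1 / math.log10(2))    # about 3.32 bits per decit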

References in periodicals archive
The second person was also attached to an EEG amplifier, and their PC would pick up the stream of binary digits and flash an LED lamp at two different frequencies, one for zero and the other for one.
In other words, we need 7,372,800 binary digits for each frame of video.
The key length is expressed as the number of binary digits required to store the key.
The UUEncode software converts the item to binary digits and re-converts it at the other end.
Digital TV converts material into binary digits, which can be crammed together to allow far more channels into a smaller space.
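The per-frame figure of 7,372,800 binary digits quoted above is consistent with an uncompressed 640 × 480 frame at 24 bits per pixel; the resolution and bit depth here are assumptions for illustration, not stated in the quoted article.

# Assumed parameters (not given in the source): a 640x480 frame, 24 bits per pixel, uncompressed.
width, height, bits_per_pixel = 640, 480, 24
bits_per_frame = width * height * bits_per_pixel
print(bits_per_frame)   # 7372800 binary digits per frame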