binary digit[′bīn·ə·rē ′dij·ət]
(in information theory), a unit used to measure entropy and the quantity of information. A source with two equiprobable messages has an entropy of 1 binary digit (1 bit). The term derives from the fact that the number of binary digits determines (to within 1) the average number of characters required to record messages from a given source in binary code. Decimal digits (decits) are also used. Conversion from one unit to the other corresponds to changing the base of the logarithms used in determining the entropy and the quantity of information (base 10 instead of base 2). The conversion formula is 1 decit = 1/log₁₀ 2 bits ≈ 3.32 bits.
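The definitions above can be sketched in a short Python example (names such as `entropy` are illustrative, not part of any standard library): the entropy formula is evaluated with base-2 and base-10 logarithms, giving values in bits and decits respectively, and the conversion factor 1/log₁₀ 2 relates the two.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution.

    The unit follows the log base: base 2 gives bits (binary digits),
    base 10 gives decits (decimal digits).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A source with two equiprobable messages has an entropy of 1 bit.
h_bits = entropy([0.5, 0.5], base=2)     # 1.0 bit
h_decits = entropy([0.5, 0.5], base=10)  # the same entropy in decits

# Conversion factor: 1 decit = 1/log10(2) bits, approximately 3.32 bits.
bits_per_decit = 1 / math.log10(2)
```

Multiplying an entropy expressed in decits by `bits_per_decit` recovers the value in bits.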