American Standard Code for Information Interchange, a set of codes used to represent letters, numbers, a few symbols, and control characters. Originally designed for teletype operations, it has found wide application in computers. A seven-digit (or seven-bit) binary number (see binary system) can represent one of 128 distinct codes. Thus, in decimal equivalents, the series "72, 69, 76, 76, 79" represents the letters "H, E, L, L, O" in ASCII. With the introduction of its personal computer in 1981, the International Business Machines Corporation (IBM) increased the number of available characters to 256 by using an eight-bit byte. This IBM-extended ASCII set has become a de facto standard. However, the inability of US-ASCII to represent many other languages became an obvious and intolerable shortcoming as computer use outside the United States and United Kingdom increased. As a consequence, national extensions to US-ASCII were developed that were incompatible with one another. This in turn led to the standardization of 16-bit (or "double-byte") and 32-bit character sets, such as Unicode, that could accommodate large numbers of linguistic and other symbols.
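The decimal series above can be checked with a short sketch (Python is used here purely for illustration):

```python
# Decode a list of ASCII decimal codes into the characters they represent.
codes = [72, 69, 76, 76, 79]
text = "".join(chr(c) for c in codes)
print(text)  # HELLO

# The reverse mapping: each character back to its numeric code.
print([ord(ch) for ch in "HELLO"])  # [72, 69, 76, 76, 79]
```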
American Standard Code for Information Interchange: a computer code for representing alphanumeric characters
ASCII (American Standard Code for Information Interchange) Pronounced "ask-ee," it is the built-in binary code for representing characters in all computers except IBM mainframes, which use the EBCDIC coding system. ASCII was originally developed for communications and uses only seven bits per character, providing 128 combinations that include upper and lower case alphabetic letters, the numeric digits and special symbols such as the $ and %. The first 32 characters are set aside for communications and printer control (see ASCII chart).
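The split between the 32 reserved control codes and the printable characters can be illustrated with a minimal sketch (Python for illustration; the helper name `classify` is ours, not part of any standard library):

```python
# Classify a 7-bit ASCII code: 0-31 are control codes, 32-126 are printable,
# and 127 (DEL) is also treated as a control character.
def classify(code: int) -> str:
    if not 0 <= code <= 127:
        raise ValueError("not a 7-bit ASCII code")
    if code < 32 or code == 127:
        return "control"
    return "printable"

print(classify(10))  # control   (line feed, used in communications)
print(classify(36))  # printable (the '$' symbol)
```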
A Byte Holds ASCII and Then Some
Since the common storage unit in a computer is an 8-bit byte (256 character combinations) and ASCII uses only the first 128 (0-127), the second set of 128 characters (128-255) is technically not ASCII but is typically used for foreign-language and math symbols. In the first PCs running DOS, these upper values also contained elementary graphics symbols. On the Mac, the additional values can be defined by the user.
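The distinction between the 7-bit ASCII range and the upper half of the byte can be tested programmatically; a brief sketch in Python:

```python
# A byte holds values 0-255, but only 0-127 is ASCII proper.
def is_ascii_byte(b: int) -> bool:
    return 0 <= b <= 127

data = bytes([72, 69, 76, 76, 79, 200])  # last value is in the extended range
print([is_ascii_byte(b) for b in data])  # [True, True, True, True, True, False]

# Python's built-in bytes.isascii() performs the same check on a whole sequence.
print(data.isascii())  # False
```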
ASCII vs. Hex
In technical applications typically used by developers, you may have a choice between entering data in ASCII or "hex" for editing or searching. ASCII is entered by typing in regular text, but because there are not enough keys on the keyboard to enter 256 distinct characters, the hexadecimal (hex) numbering system is used. Hex is entered by typing only the digits 0 to 9 or the letters A to F, and it provides a precise way of defining any of the 256 possible combinations in the byte, whether they be control codes (0-31) or the last 128 (128-255). See hex chart, ASCII file and Unicode.
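The correspondence between ASCII text and its hex representation, as displayed by a typical hex editor, can be sketched as follows (Python for illustration):

```python
# Show each character of a string alongside its hexadecimal byte value,
# the way a hex editor or search field would present it.
text = "Hi$"
hex_view = " ".join(f"{ord(ch):02X}" for ch in text)
print(hex_view)  # 48 69 24

# Hex covers all 256 byte values, including the control codes (0-31):
print(0x1B)  # 27, the ESC control code

# And the round trip back from hex to ASCII text:
print(bytes.fromhex("48 69 24").decode("ascii"))  # Hi$
```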