ASCII
ASCII or American Standard Code for Information Interchange, a set of codes used to represent letters, numbers, a few symbols, and control characters. Originally designed for teletype operations, it has found wide application in computers. A seven-digit (or seven-bit) binary number (see binary system) can represent one of 128 distinct codes. Thus, in decimal equivalents, the series “72, 69, 76, 76, 79” represents the letters “H, E, L, L, O” in ASCII. With the introduction of its personal computer in 1981, the International Business Machines Corporation (IBM) increased the number of available characters to 256 by using an eight-bit byte. This IBM-extended ASCII set has become a de facto standard. However, the inability of US-ASCII to correctly represent many other languages became an obvious and intolerable misfeature as computer use outside the United States and United Kingdom increased. As a consequence, national extensions to US-ASCII were developed that were incompatible with one another. This in turn led to the standardization of 16-bit (or “double-byte”) and 32-bit character sets, such as Unicode, that could accommodate large numbers of linguistic and other symbols.
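The decimal series in the example above can be checked directly; the following sketch (Python is used here purely for illustration) converts codes to characters and back with the built-in chr() and ord() functions:

```python
# Decode the decimal ASCII codes from the example above.
codes = [72, 69, 76, 76, 79]
text = "".join(chr(c) for c in codes)
print(text)  # HELLO

# The reverse direction: ord() returns the ASCII code of a character.
print([ord(ch) for ch in "HELLO"])  # [72, 69, 76, 76, 79]
```

Note that these particular codes are the uppercase letters; the lowercase forms occupy a separate range (97 to 122).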
The Columbia Electronic Encyclopedia™ Copyright © 2022, Columbia University Press. Licensed from Columbia University Press. All rights reserved.
American Standard Code for Information Interchange: a computer code for representing alphanumeric characters.
Collins Discovery Encyclopedia, 1st edition © HarperCollins Publishers 2005
ASCII (American Standard Code for Information Interchange) Pronounced "ask-ee," ASCII is the built-in binary code for representing characters in all computers except IBM mainframes, which use EBCDIC coding. ASCII was originally developed for communications and uses seven bits per character, providing 128 combinations that include the upper and lower case alphabetic letters, the numeric digits and special symbols such as the $ and %. The first 32 characters are format codes (tab, return, etc.) as well as control codes for communications and printers (see below). See ASCII file, ASCII ribbon campaign, EBCDIC and Unicode.
ASCII vs. Hex
In technical editors used by developers, there is often a choice between entering data as ASCII or as hexadecimal ("hex"). ASCII input accepts any character, while hex input is limited to the digits 0 to 9 and the letters A to F. See hex chart and hex editor.
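The two views described above are the same bytes written differently; a brief sketch (Python used for illustration) shows the ASCII text "Hello" and its hex representation side by side:

```python
# The same bytes viewed two ways: as ASCII text and as hex digits.
data = b"Hello"
print(data.decode("ascii"))          # Hello
print(data.hex())                    # 48656c6c6f (only 0-9 and a-f)
print(bytes.fromhex("48656c6c6f"))   # b'Hello'
```

Each byte becomes exactly two hex digits, which is why hex editors typically show a hex pane and an ASCII pane for the same data.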
A Byte Holds ASCII and Then Some
The common storage unit in a computer is an 8-bit byte, which holds 256 character combinations (0-255). However, ASCII uses only the first 128 (0-127); the rest (128-255) are used for foreign-language and math symbols.
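The 0-127 boundary described above is easy to test in code; this sketch (Python used for illustration, with a hypothetical helper name) checks whether a byte value falls in the standard ASCII range:

```python
# An 8-bit byte holds values 0-255, but only 0-127 are standard ASCII.
# is_ascii_byte is an illustrative helper, not a standard library function.
def is_ascii_byte(b: int) -> bool:
    return 0 <= b <= 127

print(is_ascii_byte(65))   # True  (the letter A)
print(is_ascii_byte(200))  # False (falls in the extended range 128-255)
```

Equivalently, a value is in the extended range exactly when its high (eighth) bit is set, i.e. when `b & 0x80` is nonzero.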
Copyright © 1981-2019 by The Computer Language Company Inc. All Rights reserved. THIS DEFINITION IS FOR PERSONAL USE ONLY. All other reproduction is strictly prohibited without permission from the publisher.