16-bit computing

CPUs that process 16 bits as a single unit, in contrast to 8-, 32- or 64-bit CPUs. The first personal computers in the late 1970s used 8-bit CPUs, but the industry moved to 16 bits with the IBM PC in 1981. In the mid-1980s, PCs jumped to 32 bits with the Intel 386, while the Mac had debuted in 1984 with the Motorola 68000, a hybrid CPU with 32-bit registers but a 16-bit data bus. See 8088, 386 and 68000.
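To make the notion of a 16-bit word concrete, the following minimal sketch in standard C (an illustration added here, not part of the original definition) uses the fixed-width uint16_t type to show the value range a 16-bit register can hold and the wraparound that occurs when arithmetic exceeds it.

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* A 16-bit unsigned word holds values 0..65535 (2^16 - 1). */
        uint16_t word = UINT16_MAX;               /* 65535, all 16 bits set */
        printf("max 16-bit value: %u\n", (unsigned)word);

        /* Adding 1 exceeds 16 bits, so the value wraps around to 0,
           just as it would in a 16-bit hardware register. */
        word = (uint16_t)(word + 1);
        printf("after wraparound: %u\n", (unsigned)word);  /* prints 0 */

        /* A 16-bit address likewise spans only 64 KB (65,536 bytes),
           which is why 16-bit machines often used segmentation. */
        printf("16-bit address space: %lu bytes\n",
               (unsigned long)UINT16_MAX + 1);
        return 0;
    }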

16-bit CPUs are still used as embedded processors in myriad products that do not require higher speed. However, years of design effort have made 32-bit CPUs faster, more efficient, smaller and less expensive, and thus competitive with 16-bit CPUs for many embedded applications. See 8-bit computing, 32-bit computing, 64-bit computing, 128-bit computing and bit specifications.

