16-bit computing

CPUs that process 16 bits as a single unit, in contrast to 8, 32 or 64. The first personal computers in the late 1970s used 8-bit CPUs; the industry moved to 16 bits with the IBM PC in 1981, built around Intel's 8088. In the mid-1980s, PCs jumped to 32 bits with the Intel 386, while the Mac debuted in 1984 with the Motorola 68000, a hybrid design with 32-bit registers and a 16-bit data bus. See 8088, 386 and 68000.

16-bit CPUs are still widely used as embedded processors in myriad products that do not require higher speeds. Over the years, however, considerable design effort went into 32-bit CPUs, making them faster, more efficient, smaller and less expensive, and thus competitive with 16-bit CPUs in numerous embedded applications. See 8-bit computing, 32-bit computing, 64-bit computing, 128-bit computing and bit specifications.


Copyright © 1981-2019 by The Computer Language Company Inc. All rights reserved. THIS DEFINITION IS FOR PERSONAL USE ONLY. All other reproduction is strictly prohibited without permission from the publisher.