16-bit computing

CPUs that process 16 bits as a single unit, compared with 8, 32 or 64. The first personal computers in the late 1970s used 8-bit CPUs, but the market migrated to 16 bits with the IBM PC in 1981. In the mid-1980s, PCs jumped to 32 bits with the Intel 386, while the Mac debuted in 1984 with the Motorola 68000, a hybrid 16/32-bit design. See 8088, 386 and 68000.
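
To make the word-size idea concrete, the short C sketch below (an illustration, not part of the original entry) prints the value range of a 16-bit word and shows how unsigned arithmetic wraps at 2^16:

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

/* Minimal sketch: a 16-bit word holds 0..65535 unsigned or
 * -32768..32767 signed, and unsigned arithmetic wraps modulo 2^16,
 * the practical meaning of "processing 16 bits as a single unit". */
int main(void) {
    uint16_t word = UINT16_MAX;        /* 65535, largest unsigned 16-bit value */
    printf("max unsigned: %" PRIu16 "\n", word);

    word = (uint16_t)(word + 1u);      /* wraps around to 0 */
    printf("after wraparound: %" PRIu16 "\n", word);

    int16_t s = INT16_MAX;             /* 32767, largest signed 16-bit value */
    printf("max signed: %" PRId16 "\n", s);
    return 0;
}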

16-bit CPUs are still used as embedded processors in myriad products that do not require higher speeds. However, over the years a great deal of design effort has gone into 32-bit CPUs, making them faster, more efficient, smaller and less expensive, and therefore competitive with 16-bit CPUs for numerous embedded applications. See 8-bit computing, 32-bit computing and bit specifications.
