32-bit computing

CPUs that process 32 bits as a single unit, in contrast to 8, 16 or 64 bits. Although 32-bit CPUs were used in mainframes as early as the 1960s, personal computers began migrating from 16 to 32 bits in the 1980s. Starting with the first 32-bit 386 chip in 1985, Intel x86 CPUs were built with a 16-bit mode for compatibility with 16-bit applications (see 386).
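The practical meaning of "processing 32 bits as a single unit" can be illustrated with a small sketch: a 32-bit register holds values from 0 to 2^32 − 1, and arithmetic on it wraps modulo 2^32. The `add32` helper below is a hypothetical illustration, not drawn from this entry.

```python
# Illustrative sketch: arithmetic in a 32-bit register wraps modulo 2**32.
MASK32 = 0xFFFFFFFF  # 32 bits of ones; the largest value a 32-bit register can hold

def add32(a: int, b: int) -> int:
    """Add two values the way a 32-bit CPU would: keep only the low 32 bits."""
    return (a + b) & MASK32

print(MASK32)                 # 4294967295, the 32-bit maximum (2**32 - 1)
print(add32(MASK32, 1))       # wraps around to 0
print(add32(0x7FFFFFFF, 1))   # 2147483648, overflow of the signed 32-bit range
```

A 16-bit CPU would behave the same way but with a 16-bit mask (0xFFFF), wrapping at 65,536 instead of roughly 4.3 billion.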

Running in 32-bit mode does not get twice as much real work done as 16-bit mode, because word size is only one aspect of internal processing. The CPU's clock speed, along with the speed, size and architecture of the disks, memory and peripheral bus, all play important roles in a computer's performance (see throughput). See 8-bit computing, 16-bit computing, 64-bit computing, 128-bit computing and bit specifications.


Copyright © 1981-2025 by The Computer Language Company Inc. All Rights reserved. THIS DEFINITION IS FOR PERSONAL USE ONLY. All other reproduction is strictly prohibited without permission from the publisher.