microprogramming
[¦mī·krō′prō‚gram·iŋ] (computer science)
Transformation of a computer instruction into a sequence of elementary steps (microinstructions) by which the computer hardware carries out the instruction.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
microprogramming
This article is provided by FOLDOC - Free Online Dictionary of Computing (foldoc.org)
microcode
A set of elementary instructions in a complex instruction set computer (CISC). The microcode resides in a separate high-speed memory and functions as a translation layer between the machine instructions and the circuit level of the computer. Microcode enables the computer designer to create machine instructions without having to design electronic circuits. Writing microcode is called "microprogramming," and the microcode for a given computer is called a "microprogram." RISC computers do not use microcode, which is why RISC compilers generate more instructions than CISC compilers.
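The "translation layer" described above can be sketched as a lookup from each machine opcode to its stored sequence of elementary steps. This is a minimal, purely illustrative model, not any real instruction set; all opcode and micro-step names here are hypothetical:

```python
# Illustrative sketch of a control store: a small read-only table that
# maps each machine instruction to its microprogram (a fixed sequence
# of microinstructions). All names are hypothetical.
CONTROL_STORE = {
    # A CISC-style ADD that reads its operand from memory decomposes
    # into several elementary steps (address, memory read, ALU add).
    "ADD_MEM": ["MAR<-ADDR", "MDR<-MEM[MAR]", "ACC<-ACC+MDR"],
    # A register-to-register ADD needs only the ALU step.
    "ADD_REG": ["ACC<-ACC+REG"],
}

def expand(instruction):
    """Return the microinstruction sequence for one machine instruction."""
    return CONTROL_STORE[instruction]

print(expand("ADD_MEM"))  # one machine instruction, three micro-steps
```

The point of the sketch is that changing an instruction's behavior means editing a table entry (the microprogram), not redesigning circuitry.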
Source Code to Machine Code to Microcode
When software is written, the source code is converted into machine instructions by assemblers and compilers. At execution time, the machine instructions are converted into microinstructions, and the microinstructions switch the transistors in the circuits on and off. See microinstruction, CISC and RISC.
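The execution-time step above can be illustrated with a toy simulation in which each machine instruction expands into micro-steps that update a simulated register. Everything here (the opcodes, the single `ACC` register, the micro-step functions) is a hypothetical stand-in, not a real machine:

```python
# Toy model of execution time: machine instructions expand into
# micro-steps that update simulated machine state. All names are
# hypothetical and chosen only for illustration.

def micro_load(state, value):   # micro-step: move a value into ACC
    state["ACC"] = value

def micro_add(state, value):    # micro-step: ALU add into ACC
    state["ACC"] += value

# Each machine opcode expands to its sequence of micro-steps.
EXPAND = {
    "LOAD": [micro_load],
    "ADD":  [micro_add],
}

def run(machine_code):
    """Execute a list of (opcode, operand) machine instructions."""
    state = {"ACC": 0}
    for opcode, operand in machine_code:
        for micro_step in EXPAND[opcode]:
            micro_step(state, operand)
    return state

# Machine code as a compiler might emit for a source statement like 'x = 5 + 3':
print(run([("LOAD", 5), ("ADD", 3)]))  # {'ACC': 8}
```

In real hardware the micro-steps are not Python functions but control signals that open and close gates; the layering (source to machine code to microinstructions) is the same.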
Copyright © 1981-2019 by The Computer Language Company Inc. All Rights reserved. THIS DEFINITION IS FOR PERSONAL USE ONLY. All other reproduction is strictly prohibited without permission from the publisher.