An instrument designed to measure electrical quantities. A typical multimeter can measure alternating- and direct-current potential differences (voltages), current, and resistance, with several full-scale ranges provided for each quantity. Sometimes referred to as a volt-ohm meter (VOM), it is a logical development of the electrical meter, providing a general-purpose instrument. Many kinds of special-purpose multimeters are manufactured to meet the needs of such specialists as telephone engineers and automobile mechanics testing ignition circuits. See Ammeter, Current measurement, Ohmmeter, Resistance measurement, Voltage measurement, Voltmeter
Multimeters originated when all electrical measuring instruments used analog techniques. They were generally based on a moving-coil indicator, in which a pointer moves across a graduated scale. Accuracy was typically limited to about 2% of full scale, although models achieving 0.1% were available. Analog multimeters are still preferred for some applications, but for most purposes digital instruments are now used. In these, the measured value is presented as a numeric readout. Even inexpensive hand-held models perform at least as well as a good analog design. High-resolution multimeters have short-term errors as low as 0.1 part per million (ppm) and drift of less than 5 ppm per year. Many digital multimeters can be commanded by, and send their readings to, computers or other control equipment.
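To illustrate what a ppm-level specification means in practice, the sketch below converts a ppm-based accuracy figure into an absolute uncertainty for a given reading. The two-term "ppm of reading plus ppm of range" form is a common way such specifications are stated, but the particular function, parameter names, and spec values here are hypothetical, chosen only for illustration.

```python
def uncertainty_volts(reading_v, ppm_of_reading, ppm_of_range, range_v):
    """Worst-case uncertainty for a DC voltage reading, as the sum of an
    error-of-reading term and an error-of-range term (hypothetical spec form).
    1 ppm = 1 part in 10**6."""
    return reading_v * ppm_of_reading / 1e6 + range_v * ppm_of_range / 1e6

# Example: a 10 V reading on the 10 V range of a high-resolution meter,
# with an assumed one-year spec of 5 ppm of reading + 1 ppm of range.
u = uncertainty_volts(10.0, ppm_of_reading=5, ppm_of_range=1, range_v=10.0)
# u == 6e-05 V, i.e. 60 microvolts on a 10 V reading
```

This shows why a 5 ppm annual drift is remarkable: over a year, a 10 V measurement stays trustworthy to within tens of microvolts, several orders of magnitude tighter than the roughly 2% (0.2 V on the same reading) typical of a moving-coil analog meter.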