manometer
manometer (mənŏm`ĭtər): see pressure.
A double-leg liquid-column gage used to measure the difference between two fluid pressures. Micromanometers are precision instruments which typically measure from very low pressures to 50 mm of mercury (6.7 kilopascals). The barometer is a special case of manometer with one pressure at zero absolute. See Barometer
The various types of manometers have much in common with the U-tube manometer, which consists of a hollow tube, usually glass, a liquid partially filling the tube, and a scale to measure the height of one liquid surface with respect to the other (see illustration). If the legs of this manometer are connected to separate sources of pressure, the liquid will rise in the leg with the lower pressure and drop in the other leg. The difference between the levels is a function of the applied pressure and the specific gravity of the pressurizing and fill fluids.
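The dependence of the reading on the level difference and the fluid densities described above can be sketched numerically; a minimal illustration (the function name and values are ours, not from the source):

```python
# Differential pressure indicated by a U-tube manometer:
# dp = (rho_fill - rho_over) * g * h, where h is the level difference
# and rho_over is the density of the pressurizing fluid above the fill liquid.

G = 9.80665  # standard gravity, m/s^2

def u_tube_dp(h_m, rho_fill, rho_over=0.0):
    """Pressure difference (Pa) for a level difference h_m (m)."""
    return (rho_fill - rho_over) * G * h_m

# 100 mm of mercury (13595 kg/m^3) with air above (density negligible):
print(round(u_tube_dp(0.100, 13595.0)))  # ~13332 Pa
```

When the pressurizing fluid is a liquid rather than a gas, its density is no longer negligible, which is why the reading depends on both specific gravities.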
A well-type manometer has one leg with a relatively small diameter, and the second leg is a reservoir. The cross-sectional area of the reservoir may be as much as 1500 times that of the vertical leg, so that the level of the reservoir does not change appreciably with a change of pressure. Mercurial barometers are commonly made as well-type manometers.
The inclined-tube manometer is used for gage pressures below 10 in. (250 mm) of water differential. The small-diameter leg of a well-type manometer is inclined from the vertical to elongate the scale, so that a small vertical rise of liquid sweeps out a longer distance along the tube. Inclined double-leg U-tube manometers are also used to measure very low differential pressures. See Pressure measurement
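The scale elongation of the inclined tube is purely geometric: a reading L along the tube corresponds to a vertical rise h = L sin θ. A short sketch (names are illustrative):

```python
import math

def vertical_height(scale_reading_m, incline_deg):
    """Vertical liquid rise for a reading taken along an inclined tube."""
    return scale_reading_m * math.sin(math.radians(incline_deg))

# A 10-degree incline stretches 1 mm of vertical rise over ~5.76 mm of scale:
print(round(1.0 / math.sin(math.radians(10.0)), 2))  # ~5.76
```

The shallower the incline, the greater the magnification, which is why these instruments serve at very low differentials.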
An instrument for measuring the pressure of liquids and gases. A distinction is made among manometers for measuring absolute pressure, which is measured from zero (full vacuum); manometers for measuring excess (gauge) pressure, that is, the difference between absolute and atmospheric pressure when the absolute pressure is the greater; and differential manometers, for measuring the difference between two pressures, each of which usually differs from atmospheric pressure. Pressures close to atmospheric are measured with barometers, whereas the pressures of rarefied gases are measured with vacuum gauges (mainly in vacuum technology). Manometer scales are graduated in various units of pressure.
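The three kinds of measurement distinguished above differ only in their reference point; a minimal sketch (function names and the standard-atmosphere value are ours):

```python
P_ATM = 101325.0  # standard atmosphere, Pa

def absolute_from_gauge(p_gauge):
    """Absolute pressure recovered from an excess (gauge) reading."""
    return p_gauge + P_ATM

def differential(p1, p2):
    """Reading of a differential manometer between two pressures."""
    return p1 - p2

# A gauge reading of 50 kPa corresponds to an absolute pressure of ~151 kPa:
print(absolute_from_gauge(50000.0))  # 151325.0
```

A vacuum gauge is the same idea with the gauge reading negative (absolute pressure below atmospheric).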
The basis of the measurement system of a manometer is a sensing element, which is a primary pressure transducer. A distinction is made among liquid, piston, and spring manometers, depending on the principle of operation and the design of the sensing element. In addition, instruments are used in which the principle of operation is based on measurement of the changes in the physical properties of various materials caused by pressure.
Scaleless manometers, which produce standard pneumatic or electric output signals that are then fed into control devices for automatic regulation of various industrial processes, are used in addition to manometers with direct readout or recording of the readings. The areas of use of various types of manometers are shown in Figure 1.
The sensing element of liquid manometers is a column of liquid that balances the pressure being measured. The idea of using a liquid for the measurement of pressure was first advanced by the Italian scientist E. Torricelli (1640). The first mercury manometers were produced by the Italian mechanical engineer V. Viviani (1642) and the French scientist B. Pascal (1646). The structural design of liquid manometers is extremely varied; the main types are U-tube, well-type, and double-well manometers. Modern liquid manometers have ranges of measurement from 0.1 newton per sq m (N/m2) to 0.25 meganewton per sq m (MN/m2), or about 0.01 mm of water (mm H2O) to 1900 mm of mercury (mm Hg); such instruments are used mainly for high-precision measurements in the laboratory. Liquid manometers used for the measurement of small excess pressures and rarefactions below 5 kN/m2 (37.5 mm Hg) are called micromanometers. Manometers used for measurements within narrow ranges are filled with light liquids (water, alcohol, toluene, or silicone oils); instruments for wider measurement ranges are filled with mercury. During measurement of pressure with an inclined-tube micromanometer, the liquid filling the reservoir is displaced into the tube, and the change in the level of the liquid is read against the scale, which is calibrated in units of pressure. The upper limit of measurement with such an instrument is no higher than 2 kN/m2 (about 200 mm of water) at the greatest angle of inclination. Accurate measurements and the testing of micromanometers require double-well manometers of the compensation type, in which one of the vessels is fixed while the second may be displaced vertically to generate the column of liquid required to balance the pressure. The displacement, determined by means of a precise scale with a vernier or from gauge blocks, defines the pressure being measured.
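The unit conversions that recur in these figures follow from the conventional column definitions; a quick check of two of the values quoted above (the constants are standard conventional factors, not taken from the source):

```python
# Conventional conversion factors:
MM_H2O = 9.80665   # Pa per mm of water
MM_HG  = 133.322   # Pa per mm of mercury

# 0.25 MN/m^2 in mm Hg (the article rounds to "about 1900 mm Hg"):
print(round(0.25e6 / MM_HG))   # ~1875

# 5 kN/m^2 in mm H2O (the "about 500 mm H2O" micromanometer limit):
print(round(5000.0 / MM_H2O))  # ~510
```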
Micromanometers of the compensation type may be used for pressure measurements to 5 kN/m2 (about 500 mm H2O). In this case, the error does not exceed (2-5) × 10⁻³ N/m2, or (2-5) × 10⁻² mm H2O.
The upper measurement limit of liquid manometers may be raised by increasing the height of the liquid column and by using a liquid of higher density. However, even the use of mercury in manometers rarely raises its upper limit above 0.25 MN/m2 (about 1900 mm Hg)—for example, in well-type manometers, in which the wide vessel is connected to a vertical tube. Liquid manometers designed for high-precision measurements are equipped with electrical or optical readout devices, and their structural design features permit the elimination of various sources of error (the effect of temperature, vibrations, capillary force, and so on). High accuracy is achieved by a double-well mercury manometer for absolute pressure measurement with a volumetric readout (Figure 2), which is used for the determination of temperature in standard gas thermometers (D. I. Mendeleev All-Union Institute of Metrology). The measurement limits of this manometer are 0-0.13 MN/m2 (0-1000 mm Hg).
To improve the performance of manometers (mainly accuracy of readings), monitoring systems are used for automatic determination of the height of the column of liquid.
The sensing element in piston-type manometers is a piston or other body that is used to balance the pressure by means of a weight or some other device for the measurement of force. A widely used type of piston manometer has an unsealed piston, which is ground in to fit the cylinder with a small clearance and is displaced axially. The original instrument of this type was designed by the Russian scientists E. I. Parrot and H. F. E. Lenz in 1833. Piston manometers found numerous uses in the second half of the 19th century because of the studies of E. Ruchholz (Germany) and E. Amagat (France), who independently proposed the use of the unsealed piston.
The basic advantage of piston manometers over liquid manometers is the ability to measure high pressures with high accuracy. A relatively small piston-type manometer (height, about 0.5 m) has a wider range of measurement and greater accuracy than the 300-m mercury manometer designed by the French scientist L. Cailletet (1891) and mounted on the Eiffel Tower in Paris. The upper limit of measurement of piston manometers is about 3.5 GN/m2 (3.5 × 10⁸ mm H2O); even so, the height of the apparatus is no more than 2.5 m. Measurement of such pressures with a mercury manometer would necessitate an increase in its height to 26.5 km.
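The column-height comparison invoked here follows from h = p/(ρg); a quick check of the figure quoted above (the exact result depends on the mercury density assumed):

```python
G = 9.80665        # standard gravity, m/s^2
RHO_HG = 13595.1   # density of mercury at 0 deg C, kg/m^3

def column_height_m(p_pa, rho):
    """Height of a liquid column that balances pressure p_pa."""
    return p_pa / (rho * G)

# Mercury column needed to balance 3.5 GN/m^2:
print(round(column_height_m(3.5e9, RHO_HG) / 1000, 1))  # ~26.3 km
```

This lands close to the 26.5 km cited in the article; the small gap reflects the density and gravity values assumed.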
The most widespread piston-type manometers have a plain unsealed piston (Figure 3). The space below the piston is filled with oil, which flows under pressure into the clearance between the piston and the cylinder and lubricates the sliding surfaces. Rotation of the piston relative to the cylinder prevents contact friction. The pressure is determined from the weights used to balance it and from the cross section of the piston. The range of measurement may be varied within wide limits by changing the weights and the cross section of the piston; the range is 0.04-10.0 MN/m2, or 0.4-100.0 kilograms-force per sq cm (kgf/cm2), for piston manometers of this type, and the errors of the most accurate standard manometers are not greater than 0.002-0.005 percent. A further increase in the upper limit of measurement makes the piston area so small that special supports for the weights (support rods and lever devices) become necessary. For example, to decrease the total weights used in the manometer based on the system of M. K. Zhokhovskii (USSR), the balancing force is generated by a hydraulic booster. As a result, even at pressures as high as 2.5 GN/m2 (2.5 × 10⁴ kgf/cm2) the measuring device remains compact and does not require a large number of weights.
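The working relation of the dead-weight piston gauge is p = mg/A; a sketch with illustrative numbers (the function name and the 1 cm² piston are ours):

```python
import math

G = 9.80665  # standard gravity, m/s^2

def piston_pressure(mass_kg, piston_diameter_m):
    """Pressure balanced by dead weights resting on an unsealed piston."""
    area = math.pi * (piston_diameter_m / 2.0) ** 2
    return mass_kg * G / area

# 10 kg on a piston of ~1 cm^2 area (diameter ~11.284 mm):
print(round(piston_pressure(10.0, 0.011284) / 1e6, 2))  # ~0.98 MN/m^2
```

Narrowing the piston raises the pressure reachable with the same weights, which is why very high ranges force the piston area, and hence the weight supports, into special designs.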
Piston manometers of special types are also used to measure small excess pressures and rarefactions, as well as absolute and atmospheric pressure. The piston systems of such manometers are usually given preliminary balancing by a special device, which makes possible a decrease in the lower limit of measurement virtually to zero. For example, the piston may be balanced by a spring mechanism. The piston is rotated by an electric motor. When a rarefaction is generated in the space above the upper part of the piston, the excess of atmospheric pressure balances the weights placed on its lower part.
Spherical and conical pistons are used in addition to cylindrical pistons. In bell manometers, the function of the piston is fulfilled by a bell, and in manometers of the “ring balance” type, by a flat partition within a hollow ring.
Piston-type manometers are used for calibrating and testing other types of manometers and for accurate measurement and monitoring of pressure, with input of readings to a numerical readout device or transmission over a distance.
The sensing element in spring (elastic) manometers is an elastic envelope that absorbs the pressure. The deformation of the envelope is a measure of the pressure that causes it. A distinction is made among tube, diaphragm, and bellows types, depending on the design of the sensing element. The principle of determining pressure from the elastic deformation of a thin envelope was proposed in 1846 by the German scientist R. Schintz; a specific aspect of this method, consisting in the determination of pressure from the deformation of a hollow tubular spring, was proposed in 1848 by the French scientist E. Bourdon, after whom the tubular spring has been named the Bourdon tube. The measurement limits of spring manometers include a wide range of pressures (from 10 N/m2 to 1000 MN/m2, or 1-10⁸ mm H2O).
Simplicity of operation, compactness of design, and convenience of use have led to industrial use of manometers of the elastic type. The simplest tubular manometer (Figure 4) contains a hollow tube bent into an arc, one end of which is attached to the volume whose pressure is to be measured, and the other, which is sealed, is connected to the lever of the transmission mechanism. A change in pressure causes deformation of the tube, and the displacement of its end is transmitted to a pointer, which indicates the pressure on a scale. In addition to tubular springs, a diaphragm or bellows may be used in elastic manometers. In addition to mechanical conversion of the deformation of the sensing element into manometer readings, electrical and optical methods of transformation are used, including those involving the transmission of the results of measurements over a distance.
Systems of automatic control and monitoring of industrial processes use elastic manometers with force compensation (in terms of the method of measurement). In this case, the manometer consists of a measuring unit and a standard electric or pneumatic force transducer. The measured pressure is transformed by the sensing element into a force, which is balanced by a force generated by a feedback mechanism rather than by the deformation of the sensor. A standard electric or pneumatic signal proportional to the pressure being measured is generated at the output of the transducer. This system makes it possible to use the same transducer in manometers for determining absolute or excess pressure, as well as vacuum, pressure differences, and parameters of heat and mass transfer, such as the temperature, level, density, and feed rate. In this case, the measurement limits may be changed over a wide range by varying the ratios of the lever arms of the transducer and the area of the bellows. The absolute pressure measurement unit consists of two bellows (Figure 5) attached to a T-shaped lever of the transducer. A vacuum is generated in one of the bellows and the second bellows communicates with the volume in which the pressure is to be measured. The action of the pressure presses the restrictor (stop) on the T-shaped lever against the nozzle, which leads to an increase of pressure in the feedback bellows and the generation of a balancing force. The transducer is supplied with compressed air from an external source. The output pressure is transmitted by means of a pneumatic amplifier to the instrumentation, which records the results of measurement.
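The "standard signal proportional to the pressure being measured" amounts to a linear mapping of the measurement span onto a fixed output span. A minimal sketch; the 4-20 mA electric span used here is a common modern convention and is our assumption, not stated in the source:

```python
def transducer_output(p, p_min, p_max, out_min=4.0, out_max=20.0):
    """Linear standard output signal (here a hypothetical 4-20 mA span)
    proportional to the measured pressure within [p_min, p_max]."""
    frac = (p - p_min) / (p_max - p_min)
    return out_min + frac * (out_max - out_min)

# Mid-span pressure gives the mid-span signal:
print(transducer_output(0.5e6, 0.0, 1.0e6))  # 12.0
```

Changing the lever-arm ratios and bellows area in the force-balance mechanism corresponds to changing `p_min` and `p_max` here: the output span stays fixed while the measured range varies.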
The use of manometers of the above types is difficult or impossible when the pressures to be measured are either very high (above 2.5 GN/m2) or close to zero (below 10 N/m2). These cases require the use of manometers based on the measurement of some parameter related to pressure by a fixed relationship. Ionization, thermal, viscosity, and radiometric manometers are used for the measurement of low absolute pressures. An example of a manometer used for the measurement of high pressures is the Manganin manometer, in which the electric resistance of a thin Manganin wire is measured as a function of pressure. Manometers based on the magnetostrictive effect and on the speed of sound are also used.
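For the Manganin gauge, a linear resistance-pressure law R = R0(1 + kp) is a reasonable first approximation; inverting it recovers the pressure. The coefficient below is an illustrative order of magnitude only, and in practice must come from calibration:

```python
# Illustrative pressure coefficient for Manganin wire (assumed value;
# the real coefficient, roughly 2.3e-11 per Pa, is set by calibration).
K_MANGANIN = 2.3e-11  # fractional resistance change per Pa

def pressure_from_resistance(r, r0, k=K_MANGANIN):
    """Pressure inferred from the relative resistance change of the wire."""
    return (r / r0 - 1.0) / k

# A 2.3% resistance increase corresponds to about 1 GN/m^2:
print(round(pressure_from_resistance(102.3, 100.0) / 1e9, 2))  # ~1.0
```

Manganin is used because its resistance is nearly insensitive to temperature, so the resistance change can be attributed to pressure alone.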
Manometers based on the change in the melting point of mercury with pressure are distinguished by high precision. The transition of mercury from the solid to liquid state is accompanied by a sharp change in the volume, which makes possible accurate recording of the temperature and pressure that correspond to the instant of melting and provides good reproducibility of results. A measuring unit with a manometer of this type makes it possible to determine pressures of up to 4 GN/m2 (approximately 4 × 10⁸ mm H2O) with an error not exceeding 1 percent and is used as a standard for superhigh pressure in calibrating and testing manometers for use up to 4 GN/m2.
Further improvement of manometers includes an increase in their accuracy, extension of the range of measurement, and an increase in reliability, durability, and convenience of use. An increase in accuracy is facilitated by the use of such materials as age-hardenable alloys, quartz (for example, in making the sensing elements of spring manometers) and of elastic supports and optical and electrical methods of transmitting and recording readings. Various means for the transmission of the results of measurements to devices with numerical readout, as well as recorders and printers, which may be located at considerable distances from the measurement sites (for example, transmission of the results of atmospheric pressure measurements from Mars and Venus during flybys of artificial satellites), are used in automatic measurement.
REFERENCES
Zhokhovskii, M. K. Tekhnika izmereniia davleniia i razrezheniia, 2nd ed. Moscow, 1952.
Zhokhovskii, M. K. Teoriia i raschet priborov s neuplotnennym porshnem, 2nd ed. Moscow, 1966.
Andriukhina, O. V., and V. N. Gramenitskii. Obraztsovye gruzoporshnevye pribory dlia izmereniia davleniia, sily i massy [survey]. Moscow, 1969.
Khansuvarov, K. I. Tochnye pribory dlia izmereniia absoliutnogo davleniia. Moscow, 1971.
K. I. KHANSUVAROV