decimal
a fraction whose denominator is a positive integer power of 10. A decimal is written without its denominator: the numerator is written with as many digits to the right of the decimal point as there are zeros in the denominator (for example, 485,634/1,000 = 485.634 and 3/100 = 0.03). In this notation, the part to the left of the decimal point is the integer part of the fraction. The first digit after the decimal point gives the number of tenths; the second, the number of hundredths; and so forth.
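The rule above can be sketched mechanically (a minimal Python illustration; the function name is my own, not part of the source): pad the numerator with leading zeros as needed, then place the decimal point as many digits from the right as there are zeros in the denominator.

```python
def fraction_to_decimal_string(numerator: int, zeros: int) -> str:
    """Write numerator / 10**zeros as a decimal string by placing the
    decimal point `zeros` digits from the right of the numerator."""
    if zeros == 0:
        return str(numerator)
    # Pad with leading zeros so at least one digit remains for the integer part.
    digits = str(numerator).rjust(zeros + 1, "0")
    return digits[:-zeros] + "." + digits[-zeros:]

print(fraction_to_decimal_string(485634, 3))  # 485634/1000 -> 485.634
print(fraction_to_decimal_string(3, 2))       # 3/100 -> 0.03
```

Both printed values match the worked examples in the text.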
The decimal notation of a rational number whose denominator has no prime factors other than 2 and 5 contains a finite number of digits (for example, 4/25 = 0.16). In general, the digits in the decimal notation of a rational number begin repeating at some position; such a number is an infinite repeating decimal (for example, 7/6 = 1.1666 …). Irrational numbers are nonrepeating infinite decimals (for example, √2 = 1.41421 …). In all cases, the decimal a_k a_(k-1) … a_0 . b_1 b_2 … can be written in the form
a_k × 10^k + a_(k-1) × 10^(k-1) + … + a_0 + b_1/10 + b_2/10^2 + … ,
where a_k, a_(k-1), … , a_0, b_1, b_2, … are the digits 0, 1, 2, … , 9 (a_k ≠ 0) in the corresponding positions of the number. For example, 382.1274 = 3 × 10^2 + 8 × 10 + 2 + 1/10 + 2/10^2 + 7/10^3 + 4/10^4; that is, here a_2 = 3, a_1 = 8, a_0 = 2, b_1 = 1, b_2 = 2, b_3 = 7, and b_4 = 4. Decimals were already in use in the 14th-15th centuries. The Samarkand mathematician al-Kashi described the decimal system in 1427. In Europe, the decimal was introduced by S. Stevin in 1585.
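The facts about terminating and repeating expansions can be sketched via long division (a minimal Python illustration; the function name and parenthesized-repetend notation are my own): dividing and tracking remainders either reaches remainder 0 (the expansion terminates, which happens exactly when the denominator has no prime factors besides 2 and 5) or revisits a remainder (the digits from that point on repeat).

```python
def decimal_expansion(numerator: int, denominator: int) -> str:
    """Return the decimal expansion of numerator/denominator, with any
    repeating block (repetend) enclosed in parentheses."""
    integer_part, remainder = divmod(numerator, denominator)
    digits = []
    seen = {}  # remainder -> index in `digits` where it first appeared
    while remainder and remainder not in seen:
        seen[remainder] = len(digits)
        digit, remainder = divmod(remainder * 10, denominator)
        digits.append(str(digit))
    if remainder == 0:  # terminating expansion
        frac = "".join(digits)
        return f"{integer_part}.{frac}" if frac else str(integer_part)
    start = seen[remainder]  # repetition begins where this remainder first occurred
    return (f"{integer_part}." + "".join(digits[:start])
            + "(" + "".join(digits[start:]) + ")")

print(decimal_expansion(4, 25))  # 0.16   (terminating: 25 = 5 * 5)
print(decimal_expansion(7, 6))   # 1.1(6) (i.e., 1.1666 ...)
```

The two printed results reproduce the examples 4/25 = 0.16 and 7/6 = 1.1666 … from the text.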