# decimal


## decimal

**1.** a fraction that has a denominator of a power of ten, the power depending on or deciding the decimal place. It is indicated by a decimal point to the left of the numerator, the denominator being omitted. Zeros are inserted between the point and the numerator, if necessary, to obtain the correct decimal place

**2.** any number used in the decimal system

**3.**

**a.** relating to or using powers of ten

**b.** of the base ten

**4.** expressed as a decimal

*The Great Soviet Encyclopedia* (1979).

## Decimal

a fraction whose denominator is a whole power of the number 10. The decimal is written without a denominator: as many digits are set off to the right of the decimal point as there are zeros in the denominator (for example, 485,634/1,000 = 485.634 and 3/100 = 0.03). In such notation, the part to the left of the decimal point designates the integer part of the fraction. The first digit after the decimal point designates the number of tenths; the second, the number of hundredths; and so forth.
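The rule above — set off as many digits after the point as there are zeros in the denominator, padding with zeros when the numerator is too short — can be checked with a small Python sketch (the helper name `as_decimal` is illustrative, not from the text):

```python
def as_decimal(numerator: int, power: int) -> str:
    """Write numerator / 10**power as a decimal string, inserting
    zeros between the point and the numerator when needed."""
    # Pad so that at least one digit remains to the left of the point.
    digits = str(abs(numerator)).rjust(power + 1, "0")
    sign = "-" if numerator < 0 else ""
    if power == 0:
        return f"{sign}{digits}"
    return f"{sign}{digits[:-power]}.{digits[-power:]}"

print(as_decimal(485634, 3))  # 485,634/1,000 -> 485.634
print(as_decimal(3, 2))       # 3/100 -> 0.03, zeros inserted
```

Note how 3/100 picks up a padding zero after the point, exactly as the definition describes.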

The decimal notation of a rational number whose denominator has no prime factors other than 2 and 5 contains a finite number of digits (for example, 4/25 = 0.16). In general, the digits in the decimal notation of a rational number begin repeating at some position; such a number is an infinite repeating decimal (for example, 7/6 = 1.1666 …). Irrational numbers are nonrepeating infinite decimals (for example, √2 = 1.41421 …). In all cases, the decimal *a*_{k}*a*_{k-1} … *a*_{0}.*b*_{1}*b*_{2} … can be written in the form

*a*_{k} × 10^{k} + *a*_{k-1} × 10^{k-1} + … + *a*_{0} + *b*_{1}/10 + *b*_{2}/10^{2} + …

where *a*_{k}, *a*_{k-1}, … , *a*_{0}, *b*_{1}, *b*_{2}, … are the numerals 0, 1, 2, … , 9 (*a*_{k} ≠ 0) in the corresponding digit positions of the number. For example, 382.1274 = 3 × 10^{2} + 8 × 10 + 2 + 1/10 + 2/10^{2} + 7/10^{3} + 4/10^{4}; that is, here *a*_{2} = 3, *a*_{1} = 8, *a*_{0} = 2, *b*_{1} = 1, *b*_{2} = 2, *b*_{3} = 7, and *b*_{4} = 4.

Decimals were already in use in the 14th-15th centuries. The Samarkand mathematician al-Kashi described the decimal system in 1427. In Europe, the decimal was introduced by S. Stevin in 1584.
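The finite-versus-repeating distinction above can be tested mechanically: reduce the fraction, then strip factors of 2 and 5 from the denominator; the expansion terminates exactly when nothing else remains. A minimal sketch (the function name `terminates` is an illustrative choice):

```python
from math import gcd

def terminates(p: int, q: int) -> bool:
    """True if p/q has a finite decimal expansion, i.e. the reduced
    denominator has no prime factors other than 2 and 5."""
    q //= gcd(p, q)          # reduce the fraction first
    for factor in (2, 5):
        while q % factor == 0:
            q //= factor
    return q == 1

print(terminates(4, 25))  # finite: 0.16
print(terminates(7, 6))   # repeating: 1.1666...
```

Reducing first matters: 3/6 terminates (it is 1/2) even though 6 has the prime factor 3.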

## decimal

[′des·məl]

## decimal

Meaning 10. The numbering system used by humans, which is based on 10 digits. In contrast, computers use binary numbers because it is easier to design electronic systems that maintain two states rather than ten. **The Computer Language Company Inc.**
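The decimal-versus-binary contrast can be illustrated with Python's built-in base conversions:

```python
n = 10                 # ten, written with the usual decimal digits
print(bin(n))          # the same value in two-state (binary) digits
print(int("1010", 2))  # and converted back to decimal
```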