assumed decimal point

[ə′sümd ′des·məl ‚pȯint]
(computer science)
For a decimal number stored in a computer or appearing on a printout, the position at which place values change from nonnegative to negative powers of 10, but to which no storage location is assigned and at which no printed character appears, as opposed to an actual decimal point. Also known as virtual decimal point.
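A minimal sketch of the idea, using Python for illustration: the integer below is stored with no decimal point, and a separately agreed-upon scale (a hypothetical two-digits-from-the-right convention, common in currency handling) tells the program where the decimal point is assumed to lie.

```python
# The number is stored as a plain integer; no decimal point is stored
# or printed anywhere in the value itself.
stored = 12345

# The assumed (virtual) decimal point sits SCALE digits from the right.
# This convention is agreed upon by the programs that read and write
# the value, not recorded in the data.
SCALE = 2

# Interpreting the stored value requires applying the assumed scale.
actual = stored / 10 ** SCALE

print(actual)  # 123.45
```

Because the point is only a convention, a program reading the same stored digits with a different assumed scale would interpret the value differently, which is why both sides must agree on the position in advance.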