Miller code


[′mil·ər ‚kōd]
(computer science)
A code used internally in some computers, in which a binary 1 is represented by a transition (either up or down) in the middle of the bit period, and a binary 0 is represented by no transition unless it follows another 0, in which case a transition occurs at the boundary between the two bits; in this code, the longest interval possible without a transition is two bit times.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
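The transition rules above can be sketched in a few lines of Python. This is a minimal illustration, not part of the dictionary entry: the function name, the two signal levels (0 and 1), the starting polarity, and the choice to represent each bit as two half-bit samples are all assumptions made here.

```python
def miller_encode(bits):
    """Miller (delay) encode a bit sequence.

    Returns two half-bit signal samples per input bit:
    - a 1 causes a transition in the middle of its bit period;
    - a 0 causes no transition, unless the previous bit was also 0,
      in which case a transition occurs at the bit boundary.
    """
    level = 1          # assumed starting signal level
    samples = []
    prev = None
    for b in bits:
        if b == 1:
            samples.append(level)
            level = 1 - level          # mid-bit transition
            samples.append(level)
        else:
            if prev == 0:
                level = 1 - level      # boundary transition between successive 0s
            samples.append(level)
            samples.append(level)
        prev = b
    return samples

# Example: the pattern 1,0,1 yields a run of four identical half-bit
# samples in the middle, i.e. two full bit times without a transition,
# matching the longest transition-free interval the definition states.
print(miller_encode([1, 0, 1]))  # → [1, 0, 0, 0, 0, 1]
```

Note the run of four consecutive `0` samples in the output: in this scheme a transition-free stretch can span at most two bit times (four half-bit samples), which is what makes the code attractive for clock recovery.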