chip rate

(redirected from chipping rate)

In direct-sequence spread spectrum technologies such as DSSS and CDMA, the chip rate is the number of chips (pulses of the spreading code) transmitted per second. The data signal is combined with a pseudorandom spreading code, typically by multiplication or XOR, to encode each transmission uniquely. The chip rate is significantly higher than the data bit rate; the ratio between the two is known as the spreading factor. Chip rate is commonly measured in megachips per second (Mcps), which is millions of chips per second. See spread spectrum, CDMA and 802.11.
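The spreading described above can be sketched in a few lines. The 8-chip code and the data bits below are illustrative placeholders, not taken from any real standard:

```python
# Minimal sketch of direct-sequence spreading with an assumed 8-chip code.

data_bits = [1, 0, 1, 1]                  # data signal (illustrative)
chip_code = [1, 0, 1, 1, 0, 0, 1, 0]      # spreading code: 8 chips per bit

# Each data bit is XORed with the full chip code, so the transmitted
# chip stream runs 8x faster than the data stream.
spread = [bit ^ chip for bit in data_bits for chip in chip_code]

bit_rate = 1_000_000                      # assumed 1 Mbps data rate
spreading_factor = len(chip_code)         # chips per bit
chip_rate = bit_rate * spreading_factor   # chips per second

print(len(spread))        # 32 chips for 4 data bits
print(chip_rate / 1e6)    # 8.0 Mcps
```

A 1 Mbps data stream spread by an 8-chip code thus yields a chip rate of 8 Mcps, which is why the chip rate, not the bit rate, determines the occupied bandwidth.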
References in periodicals archive
H. naledi has more than twice the chipping rate of Australopithecus africanus, and four times that of Paranthropus robustus, two extinct hominin species often thought to have commonly consumed hard foods (though there is still much discussion about exactly what their diet consisted of).
Certain samples of modern humans also show a similar chipping rate to H. naledi.
On a per-load basis, chipping rate averaged 29 ovendry tons (ODT) per chipping hour and ranged from 21 to 37 ODT/hr.
Chipping rate increased with average tree size, calculated from the load weight and tree count for each load.
where:
Chipping rate = ODT produced per chipping hour
Hours = chipping hours, at the start of the load, since the beginning of the shift
Delimb = a dummy variable with a value of 1 for delimbed trees, 0 for whole trees
Chip weight per tree = ovendry pounds of chips per tree
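The excerpt names the model's variables but omits the fitted coefficients. A sketch of the linear form it describes, with hypothetical placeholder coefficients (the values below are not from the article):

```python
# Hedged sketch of the chipping-rate model described in the excerpt.
# All coefficients (b0, b_hours, b_delimb, b_weight) are hypothetical
# placeholders chosen only to illustrate the functional form.

def chipping_rate(hours, delimb, chip_weight_per_tree,
                  b0=25.0, b_hours=-0.5, b_delimb=2.0, b_weight=0.05):
    """Predicted ODT per chipping hour.

    hours: chipping hours, at the start of the load, since shift start
    delimb: 1 for delimbed trees, 0 for whole trees
    chip_weight_per_tree: ovendry pounds of chips per tree
    """
    return (b0 + b_hours * hours + b_delimb * delimb
            + b_weight * chip_weight_per_tree)

# With these placeholder numbers, roughly 36 ODT/hr for a delimbed load
# two hours into the shift at 200 lb of chips per tree.
print(chipping_rate(hours=2, delimb=1, chip_weight_per_tree=200))
```

The sign conventions here mirror the surrounding text: productivity rises with tree size (chip weight per tree) and, per the note about dulling knives, would plausibly fall with hours of chipping.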
(The chipper knives became dull during the last load, reducing productivity to 30.3 ODT/chipping hr.) Raising the chipper infeed roller or increasing the speeds of the delimber infeed rollers would probably have increased chipping rate. As noted earlier, trees were cut a week or more before processing to allow the foliage to dry.