cache coherency


cache coherency

(storage)
(Or "cache consistency") /kash koh-heer'n-see/ The synchronisation of data in multiple caches such that reading a memory location via any cache will return the most recent data written to that location via any (other) cache.

Some parallel processors do not cache accesses to shared memory to avoid the issue of cache coherency. If caches are used with shared memory then some system is required to detect when data in one processor's cache should be discarded or replaced because another processor has updated that memory location. Several such schemes have been devised.
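A minimal sketch of one such scheme, a snooping-style invalidation protocol, is shown below. It assumes a simplified two-state (valid/invalid) cache with one line per processor; names such as cache_line_t, bus_write and cpu_read are illustrative only and do not come from any real hardware or library.

    /* Sketch of snooping-based invalidation: a write is broadcast so that
     * other CPUs caching the same address discard their stale copies. */
    #include <stdio.h>
    #include <stdbool.h>

    #define NUM_CPUS 2

    typedef struct {
        unsigned addr;   /* memory address held in this line */
        int      value;  /* cached copy of the data */
        bool     valid;  /* false once another CPU writes the same address */
    } cache_line_t;

    static cache_line_t caches[NUM_CPUS]; /* one line per CPU, for brevity */
    static int memory[16];                /* tiny shared memory */

    /* A write updates memory and is "snooped" by every other cache. */
    static void bus_write(int cpu, unsigned addr, int value)
    {
        memory[addr] = value;
        caches[cpu] = (cache_line_t){ .addr = addr, .value = value, .valid = true };
        for (int other = 0; other < NUM_CPUS; other++)
            if (other != cpu && caches[other].valid && caches[other].addr == addr)
                caches[other].valid = false;   /* discard the stale copy */
    }

    /* A read uses the local cache only while the line is still valid;
     * otherwise it re-fetches from shared memory. */
    static int cpu_read(int cpu, unsigned addr)
    {
        if (caches[cpu].valid && caches[cpu].addr == addr)
            return caches[cpu].value;
        caches[cpu] = (cache_line_t){ .addr = addr, .value = memory[addr], .valid = true };
        return caches[cpu].value;
    }

    int main(void)
    {
        bus_write(0, 3, 42);                        /* CPU 0 writes 42 to address 3 */
        printf("CPU 1 reads %d\n", cpu_read(1, 3)); /* 42: miss, fetched from memory */
        bus_write(0, 3, 99);                        /* CPU 1's copy is invalidated */
        printf("CPU 1 reads %d\n", cpu_read(1, 3)); /* 99, not the stale 42 */
        return 0;
    }

Real protocols (e.g. MESI) track more states per line, but the invalidate-on-remote-write step illustrated here is the core mechanism for keeping every cache's view of a location consistent.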

cache coherency

Managing a cache so that data are not lost or overwritten. For example, when data have been updated in a cache but not yet written back to the target memory or disk, the chance of corruption is greater. Cache coherency is maintained by well-designed algorithms that keep track of every read and write event, and it is even more critical in symmetric multiprocessing (SMP), where memory is shared by multiple processors. See cache and SMP.
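The write-back window described above can be sketched with a dirty bit, as below. This is a minimal illustration under assumed names (line_t, cache_flush); it is not drawn from any particular controller or operating system.

    /* Sketch of dirty-bit tracking in a write-back cache: data marked dirty
     * has not yet reached the backing store and is the vulnerable window. */
    #include <stdio.h>
    #include <stdbool.h>

    static int backing_store[16];   /* stands in for the target memory or disk */

    typedef struct {
        unsigned addr;
        int      value;
        bool     valid;
        bool     dirty;  /* true while the cached value is newer than the store */
    } line_t;

    static line_t line;

    /* Writes are absorbed by the cache and only marked dirty;
     * the backing store is not touched yet (the "write-back" part). */
    static void cache_write(unsigned addr, int value)
    {
        line = (line_t){ .addr = addr, .value = value, .valid = true, .dirty = true };
    }

    /* Flushing copies dirty data back; afterwards losing the cache is harmless. */
    static void cache_flush(void)
    {
        if (line.valid && line.dirty) {
            backing_store[line.addr] = line.value;
            line.dirty = false;
        }
    }

    int main(void)
    {
        cache_write(5, 7);
        printf("before flush: store=%d dirty=%d\n", backing_store[5], line.dirty); /* 0 1 */
        cache_flush();
        printf("after flush:  store=%d dirty=%d\n", backing_store[5], line.dirty); /* 7 0 */
        return 0;
    }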
References in periodicals archive:
However, unless the RAID controllers doing the caching are configured in dual active pairs and designed with cache coherency and robust recovery mechanisms, caching can cause incorrect data to be delivered to applications and corrupt databases when elements in the I/O path fail.
Features like cache mirroring and cache coherency have been available in RAID controllers for mainframes and high-end Unix systems for some time; however, a new generation of RAID controllers that include these features and are priced for NT servers is beginning to hit the market.