cache line



cache line

(storage)
(Or cache block) The smallest unit of memory that can be transferred between the main memory and the cache.

Rather than reading a single word or byte from main memory at a time, each cache entry usually holds a certain number of words, known as a "cache line" or "cache block", and a whole line is read and cached at once. This takes advantage of the principle of locality of reference: if one location is read, then nearby locations (particularly following locations) are likely to be read soon afterward. It can also take advantage of page-mode DRAM, which allows faster access to consecutive locations.
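As a rough illustration of why whole-line transfers pay off, the following C sketch (the 64-byte line size, array size, and timing approach are assumptions for illustration) compares a sequential traversal, which uses every word of each fetched line, with a strided traversal that touches only one word per line and so wastes most of each transfer:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 24)   /* 16M ints, much larger than any cache */
    #define STRIDE 16     /* 16 * sizeof(int) = 64 bytes: roughly one assumed cache line per access */

    int main(void)
    {
        int *a = malloc((size_t)N * sizeof *a);
        if (!a) return 1;
        for (int i = 0; i < N; i++) a[i] = i;

        clock_t t0 = clock();
        long long sum1 = 0;
        for (int i = 0; i < N; i++)        /* sequential: every word of each cached line is used */
            sum1 += a[i];
        clock_t t1 = clock();

        long long sum2 = 0;
        for (int j = 0; j < STRIDE; j++)   /* strided: each access pulls in a new line but uses one word */
            for (int i = j; i < N; i += STRIDE)
                sum2 += a[i];
        clock_t t2 = clock();

        printf("sequential: %.3fs  strided: %.3fs  (sums %lld %lld)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC, sum1, sum2);
        free(a);
        return 0;
    }

On typical hardware the strided version runs noticeably slower even though both loops perform the same number of additions, because each fetched line contributes only one useful word.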

cache line

The block of memory that is transferred to a memory cache. The cache line is generally fixed in size, typically ranging from 16 to 256 bytes. The effectiveness of the line size depends on the application, and cache circuits may allow the system designer to configure a different line size. There are also numerous algorithms for dynamically adjusting line size in real time. See cache.
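On Linux with glibc, the level-1 data cache line size can often be queried at run time; a minimal sketch, assuming the glibc-specific _SC_LEVEL1_DCACHE_LINESIZE sysconf name is available (it may report 0 on systems that do not expose the value):

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* glibc-specific sysconf name; returns 0 or -1 if the size is unknown */
        long line = sysconf(_SC_LEVEL1_DCACHE_LINESIZE);
        if (line > 0)
            printf("L1 data cache line size: %ld bytes\n", line);
        else
            printf("cache line size not reported by this system\n");
        return 0;
    }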
References in periodicals archive
In this design, the tag cache holds, for each data cache line, 19 tag bits, 1 valid bit, 1 dirty bit, and 7 LRU (Least Recently Used) bits. The tag bits hold the 19-bit tag (shown in fig 3) that comes from the cache size controller unit for the address being accessed; the valid bit indicates whether the cache line is valid, and the dirty bit is set when the cache line is written without updating the corresponding main memory line. When the machine restarts, all valid and dirty bits are reset.
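A hypothetical C bit-field sketch of the per-line metadata the excerpt describes (19 tag bits, a valid bit, a dirty bit, and 7 LRU bits); the field widths simply follow the excerpt and do not represent a definitive hardware layout:

    /* Per-line bookkeeping as described in the excerpt: 19 + 1 + 1 + 7 = 28 bits. */
    struct cache_line_meta {
        unsigned tag   : 19;  /* tag bits compared against the address being accessed */
        unsigned valid : 1;   /* line currently holds meaningful data                 */
        unsigned dirty : 1;   /* line written without updating main memory            */
        unsigned lru   : 7;   /* replacement state (Least Recently Used)              */
    };

    /* On a machine restart, all valid and dirty bits are reset. */
    static void cache_meta_reset(struct cache_line_meta *m)
    {
        m->valid = 0;
        m->dirty = 0;
    }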
The alignment of memory determines whether extra transactions or cache lines need to be fetched.
When a cache line is loaded from dynamic RAM, the two LSB bits are obtained from the counter, which increments the address by four locations.
Interference occurs in the cache system when data belonging to one thread is evicted by a cache line from another thread.
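A common way to reduce such interference between threads is to keep each thread's hot data on its own cache line; a minimal C11 sketch, assuming a 64-byte line (the structure and names are illustrative only):

    #include <stdalign.h>
    #include <stdint.h>

    #define ASSUMED_LINE_SIZE 64   /* illustrative; the real line size varies by CPU */

    /* Each thread gets its own counter, padded and aligned so that two threads
     * never write to (and therefore never repeatedly evict or invalidate) the
     * same cache line. */
    struct per_thread_counter {
        alignas(ASSUMED_LINE_SIZE) uint64_t count;
        char pad[ASSUMED_LINE_SIZE - sizeof(uint64_t)];
    };

    struct per_thread_counter counters[8];   /* one slot per thread (hypothetical) */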
Figure 4(b) shows an example of how the prefetch filtering mechanism works, and Figure 5 summarizes the states and transitions of the prefetch bit and the saturation counter for a particular cache line.
Under a fully associative scheme, any cache line can be used to store any memory block.
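To make the contrast concrete, the sketch below splits an address into offset, index, and tag fields as a set-associative or direct-mapped cache would; in the fully associative case the index field disappears and any cache line may hold the block. The sizes (64-byte lines, 64 sets) are assumptions for illustration:

    #include <stdint.h>
    #include <stdio.h>

    #define LINE_BYTES 64u    /* assumed line size  -> 6 offset bits */
    #define NUM_SETS   64u    /* assumed set count  -> 6 index bits  */

    int main(void)
    {
        uint64_t addr = 0x12345678ULL;                      /* arbitrary example address */

        uint64_t offset = addr % LINE_BYTES;                /* byte within the line */
        uint64_t index  = (addr / LINE_BYTES) % NUM_SETS;   /* which set the block must go to */
        uint64_t tag    = addr / (LINE_BYTES * NUM_SETS);   /* identifies the block within that set */

        /* Fully associative: there is no index field; the whole line address acts
         * as the tag and the block may be placed in any cache line. */
        uint64_t fa_tag = addr / LINE_BYTES;

        printf("offset=%llu index=%llu tag=%#llx (fully associative tag=%#llx)\n",
               (unsigned long long)offset, (unsigned long long)index,
               (unsigned long long)tag, (unsigned long long)fa_tag);
        return 0;
    }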
Table IV also displays some cache memory characteristics, namely the associativity of the cache, cache size (in kilobytes), and cache line size (in bytes).
To maximize throughput and thus increase overall performance, the cache communicates with memory mostly via burst operations that allow a cache line to be filled in one transaction.
Conventional design is breaking down because the natural progression to DDR3 causes an architectural incompatibility between the optimal 64-byte burst sizes required for single-channel DDR3 DRAMs and the underlying processor cache line and primitive data object sizes, such as MPEG, which are normally 32 bytes or less.
The remaining nodes caching the line are joined together in a distributed, doubly linked list, using additional pointers that are associated with each cache line in a node (which are known as forward and backward pointers).
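A hypothetical C sketch of the per-node, per-line state such a scheme implies: each cached copy of the line carries forward and backward node identifiers linking it into the distributed sharing list (all names and the sentinel encoding are invented for illustration):

    #include <stdint.h>

    #define NODE_NONE 0xFFFFu   /* sentinel: end of the sharing list (assumed encoding) */

    /* State kept alongside each cache line in a node, linking the caching nodes
     * into a distributed, doubly linked sharing list. */
    struct line_sharing_state {
        uint16_t forward_node;    /* next node caching this line     */
        uint16_t backward_node;   /* previous node caching this line */
    };

    /* Unlink this node's copy from the sharing list (pointer bookkeeping only;
     * a real protocol also exchanges coherence messages with the neighbours). */
    static void detach_from_list(struct line_sharing_state *s)
    {
        s->forward_node  = NODE_NONE;
        s->backward_node = NODE_NONE;
    }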
For each cache line in the page, the processor forces the cache-coherence hardware to issue an invalidation for the cache line.
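On x86, a user-level analogue of walking a page line by line is flushing each line with _mm_clflush; a hedged sketch (this flushes lines rather than issuing the coherence-protocol invalidations the excerpt describes, and it assumes a 4 KiB page and 64-byte line):

    #include <emmintrin.h>   /* _mm_clflush (SSE2, x86 only) */
    #include <stddef.h>
    #include <stdlib.h>

    #define PAGE_SIZE        4096u   /* assumed page size       */
    #define CACHE_LINE_SIZE    64u   /* assumed cache line size */

    /* Walk a page in cache-line-sized steps and flush each line from the caches. */
    static void flush_page(void *page)
    {
        char *p = page;
        for (size_t off = 0; off < PAGE_SIZE; off += CACHE_LINE_SIZE)
            _mm_clflush(p + off);
    }

    int main(void)
    {
        void *page = aligned_alloc(PAGE_SIZE, PAGE_SIZE);  /* C11 */
        if (!page) return 1;
        flush_page(page);
        free(page);
        return 0;
    }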
A memory line refers to a cache-line-sized block in the memory, while a cache line refers to the actual cache block to which a memory line is mapped.