von Neumann bottleneck

[fȯn ′nȯi‚män ′bäd·əl‚nek]
(computer science)
An inefficiency inherent in the design of any von Neumann machine that arises from the fact that most computer time is spent in moving information between storage and the central processing unit rather than operating on it.
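The data-movement cost the definition describes is easy to observe directly. The following minimal C sketch (an illustration added here, not drawn from the sources below) performs the same 16 million additions twice over a 128 MB array: once sequentially, where hardware prefetching hides most of the memory latency, and once in random order, where nearly every load stalls on main memory. The array size, the small PRNG, and the timing approach are all illustrative choices.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* 16M doubles (128 MB): far larger than any CPU cache */

/* Small xorshift PRNG so the sketch does not depend on the range of rand(). */
static unsigned long long rng_state = 88172645463325252ULL;
static unsigned long long xorshift(void) {
    rng_state ^= rng_state << 13;
    rng_state ^= rng_state >> 7;
    rng_state ^= rng_state << 17;
    return rng_state;
}

int main(void) {
    double *a   = malloc((size_t)N * sizeof *a);
    size_t *idx = malloc((size_t)N * sizeof *idx);
    if (!a || !idx) return 1;

    for (size_t i = 0; i < N; i++) { a[i] = 1.0; idx[i] = i; }

    /* Fisher-Yates shuffle: turns the second pass into a random walk. */
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)(xorshift() % (i + 1));
        size_t t = idx[i]; idx[i] = idx[j]; idx[j] = t;
    }

    /* Pass 1: sequential access; the prefetcher hides most memory latency. */
    clock_t t0 = clock();
    double s1 = 0.0;
    for (size_t i = 0; i < N; i++) s1 += a[i];
    clock_t t1 = clock();

    /* Pass 2: identical arithmetic, but almost every load misses the cache. */
    double s2 = 0.0;
    for (size_t i = 0; i < N; i++) s2 += a[idx[i]];
    clock_t t2 = clock();

    printf("sequential sum: %.3f s (s1=%g)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, s1);
    printf("random sum:     %.3f s (s2=%g)\n", (double)(t2 - t1) / CLOCKS_PER_SEC, s2);
    free(a);
    free(idx);
    return 0;
}

Compiled with, e.g., cc -O2 bottleneck.c, the random pass is typically many times slower than the sequential one even though the arithmetic is identical; the exact ratio depends on the machine's cache and memory hierarchy, but the gap itself is the bottleneck in miniature: the time goes into moving data, not operating on it.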
References in periodicals archive
By utilising "computing-in-memory", it is possible to solve the long-standing problem of the "von Neumann bottleneck": the need to continually shuffle data between processing cores and memory.
There are several other lines of research aimed at eliminating the von Neumann bottleneck, including Processing-in-Memory (PIM) technology, quantum computing, and even neuromorphic computing inspired by the structure of the human brain.
To overcome this so-called von Neumann bottleneck, it is not sufficient to optically connect memory and processor, as the optical signals still have to be converted back into electrical signals.
There are two main problems in computer science today: the low energy efficiency of existing RAM and processors, and the harder problem of low performance caused by the von Neumann bottleneck. HP can make a killing by solving just the first problem.
With separate memory and processing units, data processing requires multiple transfer steps between the memory and the CPU, a limitation referred to as the von Neumann bottleneck. Even with parallel processing, the current architecture is inadequate for the continually growing big data that futurists are now working with for forecasting.
"This is called the von Neumann bottleneck. The bigger we build the machines, the worse it gets" (Hillis 1985, p.