parallel processing



parallel processing,

the concurrent or simultaneous execution of two or more parts of a single computer program, at speeds far exceeding those of a conventional computer. Parallel processing requires two or more interconnected processors, each of which executes a portion of the task; some supercomputer parallel-processing systems have hundreds of thousands of microprocessors. The processors access data through shared memory. The efficiency of parallel processing is dependent upon the development of programming languages that optimize the division of the tasks among the processors.
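A minimal sketch of the idea, using Python's standard multiprocessing module (an illustration, not part of the entry above): one program's work is divided among several processors, each executes its portion, and the partial results are combined.

    # Split one program's work across several processors and combine the results.
    from multiprocessing import Pool

    def partial_sum(chunk):
        # Each processor executes this portion of the overall task.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]         # divide the task four ways
        with Pool(processes=4) as pool:
            total = sum(pool.map(partial_sum, chunks))  # combine partial results
        print(total)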

Bibliography

See E. Rietman, Exploring Parallel Processing (1990); K. M. Chandy and S. Taylor, An Introduction to Parallel Programming (1992); D. I. Moldovan, Parallel Processing from Applications to Systems (1993); G. S. Almasi and A. Gottlieb, Highly Parallel Computing (1993).

parallel processing

[¦par·ə‚lel ′prä·sə·siŋ]
(psychology)
The processing of several pieces of information at the same time.

parallel processing

(parallel)
(Or "multiprocessing") The simultaneous use of more than one computer to solve a problem. There are many different kinds of parallel computer (or "parallel processor"). They are distinguished by the kind of interconnection between processors (known as "processing elements" or PEs) and between processors and memory. Flynn's taxonomy also classifies parallel (and serial) computers according to whether all processors execute the same instructions at the same time ("single instruction/multiple data" - SIMD) or each processor executes different instructions ("multiple instruction/multiple data" - MIMD).

The processors may either communicate in order to be able to cooperate in solving a problem or they may run completely independently, possibly under the control of another processor which distributes work to the others and collects results from them (a "processor farm"). The difficulty of cooperative problem solving is aptly demonstrated by the following dubious reasoning:

If it takes one man one minute to dig a post-hole then sixty men can dig it in one second.

Amdahl's Law states this more formally.
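Amdahl's Law bounds the achievable speedup by the fraction of the work that can be parallelised. A small sketch in Python (the formula is the standard one; the figures are only illustrative):

    # Amdahl's Law: speedup = 1 / ((1 - p) + p / n),
    # where p is the parallelisable fraction of the work and n the processor count.

    def amdahl_speedup(p, n):
        """Upper bound on speedup for parallel fraction p on n processors."""
        return 1.0 / ((1.0 - p) + p / n)

    # Even with "sixty men" (n = 60), a job that is only 50% parallelisable
    # speeds up by less than a factor of two.
    print(amdahl_speedup(0.5, 60))   # ~1.97
    print(amdahl_speedup(0.95, 60))  # ~15.2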

Processors communicate via some kind of network or bus or a combination of both. Memory may be either shared memory (all processors have equal access to all memory) or private (each processor has its own memory - "distributed memory") or a combination of both.

Many different software systems have been designed for programming parallel computers, both at the operating system and programming language level. These systems must provide mechanisms for partitioning the overall problem into separate tasks and allocating tasks to processors. Such mechanisms may provide either implicit parallelism, where the system (the compiler or some other program) partitions the problem and allocates tasks to processors automatically, or explicit parallelism, where the programmer must annotate the program to show how it is to be partitioned. It is also usual to provide synchronisation primitives such as semaphores and monitors to allow processes to share resources without conflict.
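A minimal sketch of explicit parallelism with a synchronisation primitive, using Python's standard threading module (an assumption, not tied to any particular system named above): the programmer partitions the work into tasks, a semaphore limits concurrent use of a shared resource, and a lock gives monitor-style mutual exclusion on the shared result list.

    import threading

    resource_slots = threading.Semaphore(2)   # at most two tasks use the resource at once
    lock = threading.Lock()
    results = []

    def task(i):
        with resource_slots:                  # acquire/release the semaphore
            value = i * i                     # the "work" for this task
        with lock:                            # mutual exclusion for the shared list
            results.append(value)

    threads = [threading.Thread(target=task, args=(i,)) for i in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(sorted(results))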

Load balancing attempts to keep all processors busy by allocating new tasks, or by moving existing tasks between processors, according to some algorithm.
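A hedged sketch of dynamic load balancing with Python's multiprocessing pool: workers pull tasks one at a time from a shared pool, so uneven task sizes do not leave processors idle.

    from multiprocessing import Pool
    import time

    def job(n):
        time.sleep(0.01 * (n % 5))   # tasks of uneven size
        return n * n

    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            # imap_unordered hands out one task at a time, keeping all workers busy.
            results = list(pool.imap_unordered(job, range(20)))
        print(sorted(results))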

Communication between tasks may be either via shared memory or message passing. Either may be implemented in terms of the other; in fact, at the lowest level, shared memory uses message passing, since the address and data signals which flow between processor and memory may be considered as messages.
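A short sketch contrasting the two communication styles with Python's standard multiprocessing module (illustrative only): a shared Value that all processes can read and write, versus a Queue that carries explicit messages.

    from multiprocessing import Process, Value, Queue

    def via_shared_memory(counter):
        with counter.get_lock():      # all processes see the same memory word
            counter.value += 1

    def via_message_passing(q):
        q.put("hello from a worker")  # data travels as an explicit message

    if __name__ == "__main__":
        counter = Value("i", 0)
        q = Queue()
        procs = [Process(target=via_shared_memory, args=(counter,)),
                 Process(target=via_message_passing, args=(q,))]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(counter.value, q.get())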

The terms "parallel processing" and "multiprocessing" imply multiple processors working on one task whereas "concurrent processing" and "multitasking" imply a single processor sharing its time between several tasks.

See also cellular automaton, symmetric multi-processing.

Usenet newsgroup: news:comp.parallel.


parallel processing

(1) An architecture within a single computer that performs more than one operation at the same time. See GPGPU, pipeline processing and vector processor.

(2) An architecture using multiple computers. See parallel computing.