parallel processing



parallel processing,

the concurrent or simultaneous execution of two or more parts of a single computer program, at speeds far exceeding those of a conventional computer. Parallel processing requires two or more interconnected processors, each of which executes a portion of the task; some supercomputer parallel-processing systems have hundreds of thousands of microprocessors. The processors access data through shared memory. The efficiency of parallel processing is dependent upon the development of programming languages that optimize the division of the tasks among the processors.
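
As a minimal illustration of dividing one task among several processors, the following Python sketch splits a large summation into chunks and hands each chunk to a separate worker process (the chunk sizes and function names are invented for the example, not taken from any of the works cited below):

    from multiprocessing import Pool

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(range(lo, hi))

    if __name__ == "__main__":
        # Divide 0..10,000,000 into four equal chunks, one per worker process.
        chunks = [(i * 2_500_000, (i + 1) * 2_500_000) for i in range(4)]
        with Pool(processes=4) as pool:
            # Same result as sum(range(10_000_000)), computed in four parts.
            print(sum(pool.map(partial_sum, chunks)))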

Bibliography

See E. Rietman, Exploring Parallel Processing (1990); K. M. Chandy and S. Taylor, An Introduction to Parallel Programming (1992); D. I. Moldovan, Parallel Processing from Applications to Systems (1993); G. S. Almasi and A. Gottlieb, Highly Parallel Computing (1993).

parallel processing

[¦par·ə‚lel ′prä·sə·siŋ]
(psychology)
The processing of several pieces of information at the same time.

parallel processing

(parallel)
(Or "multiprocessing") The simultaneous use of more than one computer to solve a problem. There are many different kinds of parallel computer (or "parallel processor"). They are distinguished by the kind of interconnection between processors (known as "processing elements" or PEs) and between processors and memory. Flynn's taxonomy also classifies parallel (and serial) computers according to whether all processors execute the same instructions at the same time ("single instruction/multiple data" - SIMD) or each processor executes different instructions ("multiple instruction/multiple data" - MIMD).

The processors may either communicate in order to be able to cooperate in solving a problem or they may run completely independently, possibly under the control of another processor which distributes work to the others and collects results from them (a "processor farm"). The difficulty of cooperative problem solving is aptly demonstrated by the following dubious reasoning:

If it takes one man one minute to dig a post-hole then sixty men can dig it in one second.

Amdahl's Law states this more formally.
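
In its usual form, Amdahl's Law says that if a fraction P of a job can be parallelised and the rest is serial, the best possible speedup on N processors is S(N) = 1 / ((1 - P) + P/N). A small Python sketch of the calculation (the example figures are purely illustrative):

    def amdahl_speedup(p, n):
        # p: parallelisable fraction of the work, n: number of processors
        return 1.0 / ((1.0 - p) + p / n)

    # Sixty diggers on a job that is 95% parallelisable manage only about a 15x speedup,
    # not the 60x the dubious reasoning above would suggest.
    print(round(amdahl_speedup(0.95, 60), 1))   # 15.2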

Processors communicate via some kind of network or bus or a combination of both. Memory may be either shared memory (all processors have equal access to all memory) or private (each processor has its own memory - "distributed memory") or a combination of both.

Many different software systems have been designed for programming parallel computers, both at the operating system and programming language level. These systems must provide mechanisms for partitioning the overall problem into separate tasks and allocating tasks to processors. Such mechanisms may provide either implicit parallelism, where the system (the compiler or some other program) partitions the problem and allocates tasks to processors automatically, or explicit parallelism, where the programmer must annotate the program to show how it is to be partitioned. It is also usual to provide synchronisation primitives such as semaphores and monitors to allow processes to share resources without conflict.
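
As a concrete sketch of explicit parallelism with a synchronisation primitive, the following Python fragment uses a semaphore to let at most two worker threads enter the critical section at once (the names and the limit of two are invented for the example; Python threads illustrate the synchronisation idea, though for CPU-bound work separate processes would be needed for true parallel execution):

    import threading

    sem = threading.Semaphore(2)        # at most two workers inside at any moment
    lock = threading.Lock()
    results = []

    def worker(task_id):
        with sem:                       # explicit synchronisation around the shared resource
            value = task_id * task_id   # stand-in for real work
            with lock:
                results.append(value)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(sorted(results))              # [0, 1, 4, 9, 16, 25, 36, 49]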

Load balancing attempts to keep all processors busy by allocating new tasks, or by moving existing tasks between processors, according to some algorithm.
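
A process pool is one common load-balancing mechanism: whichever worker finishes its current task pulls the next one from a shared queue, so no processor sits idle while work remains. A hedged Python sketch (the task function and task sizes are placeholders):

    from multiprocessing import Pool

    def task(n):
        return sum(i * i for i in range(n))   # tasks of very uneven size

    if __name__ == "__main__":
        sizes = [2_000_000, 100, 50_000, 10, 750_000, 3_000]
        with Pool(processes=4) as pool:
            # imap_unordered hands each task to the next free worker and yields
            # results as they complete, so small tasks are not stuck behind large ones.
            for result in pool.imap_unordered(task, sizes):
                print(result)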

Communication between tasks may be either via shared memory or message passing. Either may be implemented in terms of the other and in fact, at the lowest level, shared memory uses message passing since the address and data signals which flow between processor and memory may be considered as messages.
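
Both communication styles can be sketched with Python's multiprocessing module: a shared counter that parent and child both update in place, versus a queue that carries a message from child to parent (all names here are illustrative):

    from multiprocessing import Process, Queue, Value

    def bump(counter):
        with counter.get_lock():        # shared memory: both processes address the same word
            counter.value += 1

    def report(q):
        q.put("result from child")      # message passing: the data is copied between processes

    if __name__ == "__main__":
        counter = Value("i", 0)
        q = Queue()
        p1 = Process(target=bump, args=(counter,))
        p2 = Process(target=report, args=(q,))
        p1.start(); p2.start()
        p1.join()
        print(counter.value)            # 1
        print(q.get())                  # result from child
        p2.join()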

The terms "parallel processing" and "multiprocessing" imply multiple processors working on one task whereas "concurrent processing" and "multitasking" imply a single processor sharing its time between several tasks.

See also cellular automaton, symmetric multi-processing.

Usenet newsgroup: news:comp.parallel.


parallel processing

(1) An architecture within a single computer that performs more than one operation at the same time. See GPGPU, pipeline processing and vector processor.

(2) An architecture using multiple computers. See parallel computing.