message passing

One of the two techniques for communicating between parallel processes (the other being shared memory).

A common use of message passing is for communication in a parallel computer. A process running on one processor may send a message to a process running on the same processor or on a different one. The actual transmission of the message is usually handled by the run-time support of the language in which the processes are written, or by the operating system.
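
As a concrete illustration, the sketch below uses MPI (the Message Passing Interface mentioned later in this entry) to pass a single integer between two processes; the values, tag, and process count are arbitrary choices for the example, not part of the definition.

```c
/* Minimal point-to-point message passing with MPI.
   Typical build/run: mpicc example.c -o example && mpirun -np 2 ./example
   (file name and process count are illustrative). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        value = 42;
        /* Send one int to the process with rank 1, using tag 0. */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* Block until the message from rank 0 arrives. */
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("rank 1 received %d\n", value);
    }

    MPI_Finalize();
    return 0;
}
```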

Message passing scales better than shared memory, which is generally used only in computers with relatively few processors. This is because the total communications bandwidth of a message-passing network usually increases as processors are added, whereas a shared memory tends to become a bottleneck as more processors contend for it.

A message passing system provides primitives for sending and receiving messages. These primitives may be synchronous or asynchronous, and a system may offer both kinds. A synchronous send will not complete (will not allow the sender to proceed) until the receiving process has received the message. This lets the sender know whether the message was received successfully (much as when you speak to someone on the telephone). An asynchronous send simply queues the message for transmission without waiting for it to be received (like posting a letter). A synchronous receive will wait until there is a message to read, whereas an asynchronous receive returns immediately, either with a message or with an indication that none has arrived.
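
In MPI terms, these flavours correspond roughly to the calls below. The mapping is approximate, since MPI's completion rules are more nuanced than the telephone/letter analogy; the two-process setup is the same illustrative one as in the previous sketch.

```c
/* Synchronous vs. asynchronous primitives, sketched with MPI.
   Run with two processes, e.g. mpirun -np 2 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 7, flag = 0;
    MPI_Request req;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Synchronous send: does not complete until the receiver
           has started to receive the message. */
        MPI_Ssend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);

        /* Asynchronous send: returns at once; the message is handed
           to the system and completion is checked separately. */
        MPI_Isend(&value, 1, MPI_INT, 1, 1, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);
    } else if (rank == 1) {
        /* Synchronous receive: waits until a message is available. */
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &status);

        /* Asynchronous receive: MPI_Iprobe returns immediately and
           sets flag to say whether a message has arrived; here we
           poll until one has, purely so the example terminates. */
        do {
            MPI_Iprobe(0, 1, MPI_COMM_WORLD, &flag, &status);
        } while (!flag);
        MPI_Recv(&value, 1, MPI_INT, 0, 1, MPI_COMM_WORLD, &status);
        printf("rank 1 received %d\n", value);
    }

    MPI_Finalize();
    return 0;
}
```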

Messages may be sent to a named process or to a named mailbox which may be readable by one or many processes.
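
A named mailbox can be realised in several ways; one common form on a single machine is a POSIX message queue, sketched below. The queue name "/mp_example" and the sizes are illustrative; any process that knows the name can open the queue and send to or receive from it.

```c
/* A named mailbox sketched with a POSIX message queue.
   On many Linux systems, compile with -lrt. */
#include <mqueue.h>
#include <fcntl.h>
#include <string.h>
#include <stdio.h>

int main(void)
{
    struct mq_attr attr = { .mq_maxmsg = 8, .mq_msgsize = 128 };

    /* Open (or create) the mailbox by name. */
    mqd_t mq = mq_open("/mp_example", O_CREAT | O_RDWR, 0600, &attr);
    if (mq == (mqd_t)-1) { perror("mq_open"); return 1; }

    /* Any cooperating process may send to the mailbox by name... */
    mq_send(mq, "hello", strlen("hello") + 1, 0);

    /* ...and any process that opens the same name may receive. */
    char buf[128];
    if (mq_receive(mq, buf, sizeof buf, NULL) >= 0)
        printf("mailbox got: %s\n", buf);

    mq_close(mq);
    mq_unlink("/mp_example");
    return 0;
}
```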

Transmission involves determining the location of the recipient and then choosing a route to reach that location. The message may be transmitted in one go or may be split into packets which are transmitted independently (e.g. using wormhole routing) and reassembled at the receiver. The message passing system must ensure that sufficient memory is available to buffer the message at its destination and at intermediate nodes.
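
Buffering is sometimes exposed directly in the programming interface. In MPI, for example, a buffered send makes the program supply the memory into which outgoing messages are copied; the sketch below shows the usual attach/send/detach pattern with an arbitrary message size.

```c
/* Explicit sender-side buffering with MPI's buffered send.
   Run with two processes, e.g. mpirun -np 2 ./a.out */
#include <mpi.h>
#include <stdlib.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, data[1024] = {0};

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int packed_size, buffer_size;
        void *buffer;

        /* Ask MPI how much space one such message occupies, then
           attach a buffer of that size plus fixed overhead. */
        MPI_Pack_size(1024, MPI_INT, MPI_COMM_WORLD, &packed_size);
        buffer_size = packed_size + MPI_BSEND_OVERHEAD;
        buffer = malloc(buffer_size);
        MPI_Buffer_attach(buffer, buffer_size);

        /* The message is copied into the attached buffer and the
           call returns; transmission proceeds independently. */
        MPI_Bsend(data, 1024, MPI_INT, 1, 0, MPI_COMM_WORLD);

        /* Detaching waits until buffered messages have been sent. */
        MPI_Buffer_detach(&buffer, &buffer_size);
        free(buffer);
    } else if (rank == 1) {
        MPI_Recv(data, 1024, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("rank 1 received the message\n");
    }

    MPI_Finalize();
    return 0;
}
```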

Messages may be typed or untyped at the programming language level. They may have a priority, allowing the receiver to read the highest priority messages first.
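
As one example of priority, POSIX message queues deliver the highest-priority pending message first. The sketch below reuses the hypothetical "/mp_example" mailbox from the earlier sketch.

```c
/* Message priorities with a POSIX message queue.
   On many Linux systems, compile with -lrt. */
#include <mqueue.h>
#include <fcntl.h>
#include <stdio.h>

int main(void)
{
    struct mq_attr attr = { .mq_maxmsg = 8, .mq_msgsize = 128 };
    mqd_t mq = mq_open("/mp_example", O_CREAT | O_RDWR, 0600, &attr);
    unsigned int prio;
    char buf[128];

    if (mq == (mqd_t)-1) { perror("mq_open"); return 1; }

    /* The last argument of mq_send is the message's priority. */
    mq_send(mq, "routine", 8, 1);   /* low priority  */
    mq_send(mq, "urgent", 7, 9);    /* high priority */

    /* mq_receive returns the highest-priority pending message,
       so "urgent" is delivered first despite being sent second. */
    mq_receive(mq, buf, sizeof buf, &prio);
    printf("got \"%s\" at priority %u\n", buf, prio);

    mq_close(mq);
    mq_unlink("/mp_example");
    return 0;
}
```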

Some message passing computers are the MIT J-Machine, the Illinois Concert Project and transputer-based systems.

Object-oriented programming uses message passing between objects as a metaphor for procedure call.
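
The metaphor can be made concrete even in a non-object-oriented language: "sending a message" to an object amounts to invoking a procedure the object provides, as in this toy C sketch with invented names.

```c
/* Toy illustration of the object-oriented metaphor: "sending the
   area message to a shape" is just a procedure call dispatched
   through the object.  Types and names here are invented. */
#include <stdio.h>

struct shape {
    double (*area)(const struct shape *self);   /* "message" handler */
    double width, height;
};

static double rect_area(const struct shape *self)
{
    return self->width * self->height;
}

int main(void)
{
    struct shape r = { rect_area, 3.0, 4.0 };
    /* "Send the area message to r" == call the procedure it names. */
    printf("area = %f\n", r.area(&r));
    return 0;
}
```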

message passing

Transmitting a message from one computer to another, or transferring a message within a computer from one application to another or from the operating system to an application. See MPI.
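
For message passing within a single computer, one simple mechanism is a pipe between two related processes, sketched below; the message contents and buffer size are illustrative.

```c
/* Message passing between two applications on the same machine:
   a parent and child process connected by a pipe. */
#include <stdio.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void)
{
    int fd[2];
    char buf[32];

    if (pipe(fd) != 0)
        return 1;

    if (fork() == 0) {                  /* child: the receiver */
        close(fd[1]);
        ssize_t n = read(fd[0], buf, sizeof buf - 1);
        if (n > 0) {
            buf[n] = '\0';
            printf("child received: %s\n", buf);
        }
        return 0;
    }

    close(fd[0]);                       /* parent: the sender */
    write(fd[1], "ping", 4);
    close(fd[1]);
    wait(NULL);                         /* reap the child */
    return 0;
}
```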