perceptron



perceptron

[pər′sep‚trän]
(computer science)
A pattern-recognition machine, based on an analogy to the human nervous system, that is capable of learning by means of a feedback system which reinforces correct answers and discourages wrong ones.

Perceptron

 

a mathematical model of the process of perception. A person recognizes newly encountered phenomena or objects by classifying them under some concept (a class). Thus, for instance, acquaintances are easily recognized even after a haircut or a change of clothing; manuscripts are easily read although each person’s handwriting has its own distinctive features; and different arrangements of a melody can be recognized as variations on a theme. This ability in humans is called the phenomenon of perception. On the basis of experience, a person can also develop new concepts and learn a new system of classification. For example, in learning to recognize the difference between various letter symbols, a student is shown the symbols and told to which letters the symbols correspond, that is, under which classes the symbols fall, with the result that the student eventually develops the capacity for correct classification.

Figure 1. Simplified schematic diagram of a perceptron: S units represent sensory neurons; A units represent associative neurons; R units represent reactive neurons; arrows indicate the direction of impulses through synaptic junctions

It is believed that perception is accomplished through a network of neurons. A model of perception can be represented as having three layers of neurons: a sensory layer, or S-layer; an associative layer, or A-layer; and a reactive layer, or R-layer (Figure 1). According to the simplest model, which was proposed by W. McCulloch and W. Pitts, a neuron is a nerve cell that has several inputs and one output. The inputs may be either stimulating or inhibiting. A neuron is excited and sends an impulse if the number of signals at the exciting inputs exceeds the number of signals at the inhibiting inputs by a certain quantity, which is called the neuron threshold. Depending on the nature of the external stimulus, a collection of impulses, or signals, is formed in the S-layer. Traveling through the nerve pathways, these impulses reach the neurons of the A-layer, where new impulses, corresponding to the collection of impulses that originated in the S-layer, are formed and fed to the inputs of R-layer neurons. In A-layer neurons, all input signals are summed with the same amplification coefficient, although the sign of the coefficient may differ; in R-layer neurons, both the amplification and the sign can differ among the signals that are summed. The perception of an object corresponds to the excitation of a specific neuron in the R-layer. It is believed that the amplification coefficients of reactive neurons are selected so that the collection of impulses that excites a given R-layer neuron corresponds to an entire class of different objects. A new concept can form once the amplification coefficients of the corresponding reactive neuron become established.
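The threshold rule described above can be sketched in a few lines of Python. This is a hypothetical illustration only; the function name and the 0/1 signal encoding are assumptions, not part of the McCulloch-Pitts description itself.

```python
# Minimal sketch of a McCulloch-Pitts threshold neuron as described above:
# the neuron fires (outputs 1) when the number of active exciting inputs
# exceeds the number of active inhibiting inputs by at least the threshold.
# The name `mp_neuron` and the 0/1 encoding are illustrative assumptions.

def mp_neuron(exciting, inhibiting, threshold):
    """exciting, inhibiting: lists of 0/1 input signals."""
    return 1 if sum(exciting) - sum(inhibiting) >= threshold else 0

# Three active exciting inputs, one active inhibiting input, threshold 2:
# 3 - 1 = 2 >= 2, so the neuron fires.
print(mp_neuron([1, 1, 1], [1], 2))  # → 1
```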

In 1957 the American scientist F. Rosenblatt constructed a perceptron, which he called the Mark 1, as a model of a visual analyzer. A photoelectric cell served as the model of a sensory neuron; a threshold unit with amplification coefficients of ±1 served as the model of an associative neuron; and a threshold unit with adjustable coefficients served as the model of a reactive neuron. The inputs of the A-layer threshold units were connected randomly to the photoelectric cells. Rosenblatt’s perceptron was designed to work in two modes: an operation mode and a learning mode. In the operation mode, the perceptron classified the situations presented to it: if, of all the R elements, only the element Ri was excited, then the situation fell into the ith class. The amplification coefficients of the R-layer threshold units were adjusted in the learning mode on the basis of a sequence of examples offered for assimilation.
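The adjustment of an R unit’s coefficients in the learning mode can be sketched with the classical perceptron error-correction rule. This is a hedged reconstruction in Python, not a description of the Mark 1 hardware; the function and variable names are assumptions.

```python
import numpy as np

def train_r_unit(A_outputs, labels, epochs=10, lr=1.0):
    """Error-correction learning for one R-layer threshold unit (a sketch).

    A_outputs: array of A-layer output vectors, one row per example.
    labels: +1 if the example belongs to the unit's class, -1 otherwise.
    """
    w = np.zeros(A_outputs.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(A_outputs, labels):
            y = 1 if w @ x + b > 0 else -1
            if y != t:           # wrong answer: shift weights toward the target
                w += lr * t * x
                b += lr * t
    return w, b
```

For a linearly separable set of A-layer outputs, this rule makes only a finite number of corrections before classifying every example correctly (the perceptron convergence theorem).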

The Mark 1 was the first of several models of perception. Subsequently, the process of perception was investigated with models based on digital computers. In the 1960’s, models of perception were called perceptrons, or perceptive schemes; in these, a distinction was made between the sensory part, the associative part, and the reactive threshold units. The sensory part forms a vector x̄ corresponding to each object to be assimilated; this vector is converted by the associative part into the vector ȳ. The vector ȳ belongs to the jth class if the corresponding weighted sum of the reactive element Rj exceeds the response threshold. The mathematical investigation of perceptron schemes is connected with the problem of teaching pattern recognition: how the associative part must be constructed, and what the algorithm should be for establishing the amplification coefficients of the R units in the learning mode.
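The operation mode of such a scheme, in which the object is assigned to class j when the weighted sum of the element Rj exceeds its response threshold, can be sketched as follows. This is a hypothetical illustration; the matrix names and the handling of ambiguous responses (no unit fires, or several fire) are assumptions.

```python
import numpy as np

def classify(x, A, W, thresholds):
    """Operation-mode sketch of a perceptron scheme (illustrative names).

    x: sensory vector for the presented object.
    A: fixed associative-layer weights (e.g., random ±1 entries).
    W: learned R-unit weights, one row per class.
    thresholds: response threshold of each R unit.
    Returns the class index if exactly one R unit responds, else None.
    """
    y = (A @ x > 0).astype(float)       # A-layer threshold outputs
    sums = W @ y                        # weighted sums of the R units
    fired = np.flatnonzero(sums > thresholds)
    return int(fired[0]) if len(fired) == 1 else None
```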

REFERENCES

Rosenblatt, F. Printsipy neirodinamiki. Moscow, 1965. (Translated from English.)
Minsky, M., and S. Papert. Perseptrony. Moscow, 1971. (Translated from English.)
Vapnik, V. N., and A. Ia. Chervonenkis. Teoriia raspoznavaniia obrazov. Moscow, 1974.

V. N. VAPNIK

perceptron

A network of neurons in which the output(s) of some neurons are connected through weighted connections to the input(s) of other neurons. A multilayer perceptron is a specific instance of this.