back-propagation
(Or "backpropagation") A learning algorithm for modifying a feed-forward neural network which minimises a continuous "error function" or "objective function." Back-propagation is a "gradient descent" method of training in that it uses gradient information to modify the network weights to decrease the value of the error function on subsequent tests of the inputs. Other gradient-based methods from numerical analysis can be used to train networks more efficiently.

Back-propagation makes use of a mathematical trick when the network is simulated on a digital computer: in just two traversals of the network (once forward and once back) it yields both the difference between the desired and actual output and the derivatives of this difference with respect to the connection weights.
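
A minimal sketch of those two traversals on a toy network, in Python (one sigmoid hidden unit feeding one linear output; all names, the network shape, and the squared-error choice are illustrative assumptions):

    import math

    def forward_backward(x, target, w1, w2):
        # Forward traversal: compute the network output and the error.
        h = 1.0 / (1.0 + math.exp(-(w1 * x)))    # hidden sigmoid activation
        y = w2 * h                               # linear output unit
        error = 0.5 * (y - target) ** 2          # squared-error function

        # Backward traversal: the chain rule yields the derivatives of
        # the error with respect to both connection weights in one sweep.
        dE_dy = y - target
        dE_dw2 = dE_dy * h                       # dE/dw2
        dE_dw1 = dE_dy * w2 * h * (1.0 - h) * x  # dE/dw1 via the sigmoid derivative
        return error, dE_dw1, dE_dw2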
References in periodicals archive
In particular, this research will apply a back-propagation (BP) NN model to predict design cost estimates for freeway pavement construction projects, using historical data on freeway construction projects in Henan Province as a case study of the application of the approach.
A back-propagation neural network is an important technique for defining nonlinear transfer functions between continuous input values and one or more output values.
For the development process, a classification multilayer perceptron using a back-propagation algorithm was selected.
When using the back-propagation training algorithm, the learning rate, momentum, and number of iterations can be chosen as parameters.
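
A minimal sketch of how a momentum parameter enters the weight update, in Python (the function name, eta, and the momentum coefficient mu are illustrative assumptions):

    def momentum_step(weight, grad, velocity, eta=0.1, mu=0.9):
        # velocity accumulates a decaying sum of past gradient steps;
        # mu is the momentum coefficient, eta the learning rate.
        velocity = mu * velocity - eta * grad
        return weight + velocity, velocity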
The perceptron is trained by adjusting the synaptic weights with an iterative algorithm based on the error back-propagation rule.
In addition, we used a batch back-propagation learning algorithm, the most popular algorithm among researchers and practitioners for training multi-layer perceptrons.
A multilayer feed-forward neural network with error back-propagation learning was used to approximate the measured CO/lambda biomass combustion dependence.
Furuya: "A Complex Back-propagation learning", Transactions of Information Processing Society of Japan, Vol.
While a majority of them used the back-propagation training algorithm, a few of them attempted other algorithms (Maier, H.
These premises are fine-tuned by a back-propagation-like algorithm.
Second, the gradient is calculated in a back-propagation fashion, which makes it possible to adjust the weights in a descent direction.
The applied neural network model was a feed-forward network with an error back-propagation learning algorithm.