back-propagation

(Or "backpropagation") A learning algorithm for training a feed-forward neural network by minimising a continuous "error function" or "objective function". Back-propagation is a "gradient descent" method of training: it uses gradient information to modify the network weights so as to decrease the value of the error function on subsequent presentations of the inputs. Other gradient-based methods from numerical analysis can also be used to train networks more efficiently.
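The gradient-descent idea can be illustrated with a minimal sketch (the variable names and learning rate are ours, not from the entry): each update moves the weights a small step against the gradient of a squared-error objective for a single linear unit.

```python
# Hypothetical sketch of one gradient-descent weight update,
# minimising E = 0.5 * (output - target)**2 for a linear unit.
import numpy as np

def gradient_descent_step(w, x, target, lr=0.1):
    """Return w moved one step against the gradient of the error."""
    output = w @ x                  # network output for input x
    error = output - target         # difference from desired output
    grad = error * x                # dE/dw for E = 0.5 * error**2
    return w - lr * grad

w = np.array([0.5, -0.3])
x = np.array([1.0, 2.0])
for _ in range(50):                 # repeated presentations of the input
    w = gradient_descent_step(w, x, target=1.0)
print(round(float(w @ x), 3))       # → 1.0 (error driven to ~0)
```

Each step shrinks the error; after enough presentations the output matches the target, which is all "decreasing the error function on subsequent tests of the inputs" amounts to here.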

Back-propagation makes use of a mathematical trick when the network is simulated on a digital computer: in just two traversals of the network (once forward, and once back) it yields both the difference between the desired and actual output and the derivatives of this difference with respect to the connection weights.
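The two traversals can be sketched for a tiny 2-2-1 sigmoid network (the network shape, weights, and names below are illustrative assumptions, not from the entry): the forward pass computes the output error, and the backward pass propagates error derivatives through the layers by the chain rule.

```python
# Illustrative sketch: one forward and one backward traversal of a
# 2-2-1 sigmoid network, yielding the output error and dE/dW.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))     # input -> hidden weights
W2 = rng.normal(size=(1, 2))     # hidden -> output weights
x = np.array([0.5, -1.0])
target = np.array([1.0])

# Forward traversal: compute activations layer by layer.
h = sigmoid(W1 @ x)              # hidden activations
y = sigmoid(W2 @ h)              # actual network output
error = y - target               # difference from the desired output

# Backward traversal: propagate error derivatives back through layers.
delta2 = error * y * (1 - y)             # error signal at output layer
grad_W2 = np.outer(delta2, h)            # dE/dW2
delta1 = (W2.T @ delta2) * h * (1 - h)   # error signal at hidden layer
grad_W1 = np.outer(delta1, x)            # dE/dW1
```

One backward sweep reuses the activations stored on the forward sweep, so all the weight derivatives come out of a single pass rather than one finite-difference evaluation per weight.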
References in periodicals archive
In general, the backpropagation algorithm is widely used to train multi-layer perceptrons in artificial neural networks.
The learning algorithm for calibration purposes and subsequent validation of the models was the supervised second-order Levenberg-Marquardt algorithm (Shepherd, 1997), which is a variation on the backpropagation algorithm (Rumelhart et al.
In this paper we present simulation results comparing the performance of CMA and backpropagation for compensating chromatic dispersion in an NGPON with QPSK phase modulation and coherent detection.
In the backpropagation algorithm, the learning cycle has two phases: one to propagate the input patterns through the network, and the other to adapt the output by changing the weights in the network.
Target Data: Young's modulus & tensile strength at 0° and 90° fiber orientation. Input Data: LFT-D process parameters for fiber volume weights 25%, 30%, 35%, 40%, 45%, 50%. Network Type: Feed-forward backpropagation (FFB), MATLAB. Training Function: Trainlm (Levenberg-Marquardt). Performance Function: Mean Square Error (MSE). Number of Hidden Layers: 1. Number of Input Neurons: 52. Number of Hidden Neurons: 105. Number of Output Neurons: 4. Table 2.
Subsequently, the backpropagation algorithm is applied to further refine the proposal; the combination of genetic algorithms with backpropagation is a powerful solution for evolving the weights.
Only after several simulations of different ANN models did we opt for a Feed-Forward Backpropagation (FF-BP) ANN, which gives good results and is also relatively easy to implement in hardware using microcontrollers or FPGAs [10]--[13].
Learning long-term dependencies in segmented-memory recurrent neural networks with backpropagation of error.
Many studies use a backpropagation neural network as the simulation model [19, 21].
QR Code Augmented Reality tracking with merging on conventional marker based Backpropagation neural network, Advanced Computer Science and Information Systems (ICACSIS), Proceedings of 2012 International Conference on, pp.