On the Initialization of Recurrent Backpropagation Neural Networks

A. Toledo, E. Irigoyen, and M. Pinzolas (Spain)


Keywords: neural networks, initialization, stability.


In this work, the initialization of recurrent backpropagation neural networks is studied, in both open- and closed-loop schemes. The selection of adequate ranges for the initial weights is related to the stability of the network in its initial stage. As a result, quantitative limits on the initial weights are established that guarantee stability and speed up the learning process. The theoretical developments have been tested in experiments that statistically corroborate the improvements achieved with the proposed initialization methods.
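The specific weight bounds derived in the paper are not reproduced in this abstract; a minimal sketch of the general idea, assuming stability is enforced by keeping the spectral radius of the recurrent weight matrix below 1 (the function name, range, and `rho_max` threshold are illustrative, not the authors' values):

```python
import numpy as np

def init_recurrent_weights(n_units, rho_max=0.95, seed=0):
    """Draw recurrent weights uniformly, then rescale the matrix so its
    spectral radius stays below rho_max, keeping the untrained network
    stable.  rho_max < 1 is an assumed stability criterion, not the
    paper's exact bound."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(n_units, n_units))
    rho = max(abs(np.linalg.eigvals(W)))  # spectral radius of W
    if rho > 0:
        W *= rho_max / rho
    return W

W = init_recurrent_weights(8)
print(max(abs(np.linalg.eigvals(W))))
```

Starting from a contractive recurrent map avoids diverging activations in the first epochs, which is the practical motivation for bounding the initial weights.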
