Global Stability of Stochastic, Discrete-Time Neural Networks

M.P. Joy (UK)

Keywords

stochastic neural networks, exponential stability.

Abstract

The stability analysis of neural networks is important in applications and has been studied by many authors. However, only recently has the stability of stochastic models of neural networks been investigated. In this paper we analyse the global asymptotic stability of a class of neural networks described by a stochastic difference equation, in fact a Markov chain with state space R^m. If X_n is the state of the neural network at time n, we prove that under certain conditions X_n → 0 as n → ∞, and we are able to bound the sample Lyapunov exponents; it turns out that our model is exponentially stable under these conditions. Our results assume neither symmetry of the interconnection weights nor differentiability or monotonicity of the activation functions.
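The model and the sample Lyapunov exponent can be illustrated with a small simulation. This is only an illustrative sketch, not the paper's construction: the specific update rule, the tanh activation, the weight scale, and the multiplicative-noise form are all assumptions chosen so that the origin is a fixed point and the trajectory contracts; the paper's actual conditions are more general.

```python
import numpy as np

rng = np.random.default_rng(0)

m = 4                       # state dimension (state space R^m)
# Hypothetical interconnection weight matrix -- deliberately not symmetric
W = 0.15 * rng.standard_normal((m, m))

def f(x):
    # The paper requires neither differentiability nor monotonicity of the
    # activations; tanh is used here purely as a concrete Lipschitz choice
    return np.tanh(x)

def step(x):
    # One transition of the Markov chain, X_{n+1} = W f(X_n) + g(X_n) xi_n,
    # with state-dependent (multiplicative) noise so the origin stays fixed
    xi = rng.standard_normal(m)
    return W @ f(x) + 0.05 * np.linalg.norm(x) * xi

x0 = rng.standard_normal(m)
x = x0.copy()
N = 200
for _ in range(N):
    x = step(x)

# Sample Lyapunov exponent estimate: (1/N) log(||X_N|| / ||X_0||);
# a negative value indicates exponential decay along this trajectory
lambda_hat = (np.log(np.linalg.norm(x)) - np.log(np.linalg.norm(x0))) / N
print(lambda_hat)
```

With the small weight and noise scales above the map is contracting on average, so the estimated exponent comes out negative, matching the exponential-stability conclusion of the abstract.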

IASTED