A New Robust Weight Update for Neural-Network Control

C.J.B. Macnab (Canada)


Neural Network Control, Adaptive Control, Robust Control, Robotics


This work addresses the problem of weight drift in direct adaptive control of underdamped systems using a multilayer perceptron (backpropagation network). When the number of hidden units in the neural network is small, the resulting modeling error causes weight drift, which can make the state error jump suddenly to large values. The commonly used robust weight update of e-modification halts the drift only if performance is sacrificed. This paper proposes a new method that prevents weight drift without sacrificing performance. A set of alternate weights, capable of producing the same output as the original weights, is trained online, and the weight update laws are designed to keep the original weights from drifting far from these alternate weights. A Lyapunov analysis proves semi-global uniform ultimate boundedness of all signals. A simulation experiment, trajectory tracking with a two-link flexible-joint robot, illustrates the improvement in performance compared to e-modification. The new method does not require trading off performance to prevent weight drift and the resulting large jumps in error.
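To make the trade-off concrete, the sketch below contrasts the standard e-modification law, whose damping term pulls the weights toward zero (sacrificing learned performance), with a hypothetical version of the alternate-weight idea, where damping pulls toward a co-trained weight set instead. The function names, gains, and the exact form of the alternate-weight law are illustrative assumptions, not the paper's actual update equations.

```python
import numpy as np

def e_mod_update(W, phi, e, gamma=1.0, nu=0.1):
    """Standard e-modification weight update.

    The damping term -nu*|e|*W halts weight drift, but because it
    pulls the weights toward zero it also erodes the learned
    approximation -- the performance sacrifice noted in the abstract.
    W: weight matrix, phi: basis/hidden-layer output, e: state error.
    """
    return gamma * (np.outer(e, phi) - nu * np.linalg.norm(e) * W)

def alternate_weight_update(W, W_alt, phi, e, gamma=1.0, nu=0.1):
    """Illustrative sketch (NOT the paper's law) of the alternate-weight
    idea: damp toward a co-trained weight set W_alt that produces the
    same output, rather than toward zero. Drift away from W_alt is
    penalized, but the learned output need not be erased.
    """
    return gamma * (np.outer(e, phi) - nu * np.linalg.norm(e) * (W - W_alt))
```

Note the key difference: when the original weights agree with the alternate set (`W == W_alt`), the damping term vanishes entirely, whereas e-modification keeps shrinking the weights whenever the error is nonzero.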
