Exponential stability of a neural network for bound-constrained quadratic optimisation

Conference Paper


Abstract


  • A recurrent neural network is presented that performs quadratic optimisation subject to bound constraints on each of the optimisation variables. The optimisation strategy employed by the neural network falls into the general class of gradient methods for constrained nonlinear optimisation, and is compared briefly with the strategies employed by conventional techniques for bound-constrained quadratic optimisation. Conditions on the quadratic problem and the network parameters are established under which exponential asymptotic stability is achieved. These conditions are shown to provide a tighter bound on the degree of exponential stability than that previously established for this network. Through suitable choice of the network parameters, the system of differential equations governing the network activations is preconditioned in order to reduce its sensitivity to noise and round-off errors and to accelerate convergence.
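The abstract places the network in the general class of gradient methods for bound-constrained quadratic optimisation. A minimal sketch of that general strategy (not the paper's specific network) is an Euler-discretised gradient flow whose state is projected back onto the box after each step; the problem data `Q`, `c`, `lo`, `hi` below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_gradient_qp(Q, c, lo, hi, step=0.1, iters=500):
    """Minimise 0.5 * x^T Q x + c^T x subject to lo <= x <= hi,
    via discretised gradient flow with projection onto the bounds.
    A sketch of the general gradient-method class, not the paper's
    specific recurrent network dynamics."""
    x = np.clip(np.zeros_like(c), lo, hi)  # feasible starting point
    for _ in range(iters):
        grad = Q @ x + c                   # gradient of the quadratic
        x = np.clip(x - step * grad, lo, hi)  # step, then project onto the box
    return x

# Illustrative example: minimise 0.5*(x1^2 + x2^2) - x1 - 3*x2 with 0 <= x <= 2.
Q = np.eye(2)
c = np.array([-1.0, -3.0])
x_star = projected_gradient_qp(Q, c, lo=0.0, hi=2.0)
# Unconstrained minimiser is (1, 3); the upper bound clips x2 to 2.
```

For a positive-definite `Q`, a step size below `2 / lambda_max(Q)` keeps the iteration contractive; the paper's preconditioning via the network parameters plays an analogous conditioning role for the continuous-time dynamics.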

Publication Date


  • 1994

Citation


  • Bouzerdoum, A., & Pattison, T. R. (1994). Exponential stability of a neural network for bound-constrained quadratic optimisation. In IEEE International Conference on Neural Networks - Conference Proceedings Vol. 2 (pp. 918-923).

Scopus EID


  • 2-s2.0-0028733919

Start Page


  • 918

End Page


  • 923

Volume


  • 2
