Global exponential stability in Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions via LMI approach

2011 
Abstract In this paper, we study global exponential stability in the Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions. Without assuming that the activation functions are bounded, monotonic, or differentiable, we obtain several algebraic criteria, in linear matrix inequality (LMI) form, for the global exponential stability in the Lagrange sense of such networks by means of Lyapunov functions and the Halanay delay differential inequality. Estimates of the globally exponentially attractive sets are also given. The results derived here are more general than those in the existing references. Finally, two examples are given and analyzed to demonstrate our results.
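For orientation, the Halanay delay differential inequality mentioned in the abstract is, in the generalized form typically invoked for Lagrange (ultimate-boundedness) results, the following; this is a standard statement of the tool and not an excerpt from the paper:

```latex
% Generalized Halanay inequality (standard form; sketch, not from the paper).
% Suppose V(t) \ge 0 satisfies, for constants a > b \ge 0, c \ge 0, \tau > 0,
\[
  D^{+}V(t) \;\le\; c \;-\; a\,V(t) \;+\; b \sup_{t-\tau \le s \le t} V(s),
  \qquad t \ge 0 .
\]
% Then V(t) converges exponentially into the ball of radius c/(a-b):
\[
  V(t) \;\le\; \frac{c}{a-b} \;+\;
  \Bigl(\sup_{-\tau \le s \le 0} V(s)\Bigr) e^{-\lambda t},
\]
% where \lambda > 0 is the unique positive root of the transcendental equation
\[
  \lambda \;=\; a - b\,e^{\lambda \tau}.
\]
```

Applied to a Lyapunov function of the network state, the ultimate bound \(c/(a-b)\) yields a globally exponentially attractive set, which is how Lagrange-sense exponential stability is typically established in this setting.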