Direct control method for improving stability and reliability of nonlinear stochastic dynamical systems

2020 
Abstract: Optimal control for improving the stability and reliability of nonlinear stochastic dynamical systems is of great significance for enhancing system performance, yet it has not been adequately investigated because the evaluation indicators for stability (e.g., the maximal Lyapunov exponent) and for reliability (e.g., the mean first-passage time) cannot be expressed explicitly as functions of the system states. Here, a unified procedure is established to derive optimal control strategies for improving system stability and reliability. A physical-intuition-inspired separation technique first splits the feedback control forces into conservative and dissipative components; stochastic averaging is then used to express the performance indicators of the controlled system; finally, the optimal control strategies are derived by minimizing performance indexes composed of the sigmoid function of the maximal Lyapunov exponent (for stability-based control) or the reciprocal of the mean first-passage time (for reliability-based control), together with the mean value of the quadratic form of the control force. The unified procedure converts the original functional extremum problem of optimal control into an extremum problem of a multivariable function that can be solved by optimization algorithms. A numerical example is worked out to illustrate the efficacy of the optimal control strategies in enhancing system performance.
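To illustrate the final step of the procedure described in the abstract, the following is a minimal sketch, assuming that stochastic averaging has already reduced the evaluation indicators to ordinary functions of a finite set of control parameters. All function bodies, parameter names, and the toy surrogate expressions are hypothetical placeholders, not the authors' actual formulas; only the structure of the performance index (sigmoid of the maximal Lyapunov exponent plus a penalty on the mean quadratic control effort) follows the abstract.

```python
# Sketch: after stochastic averaging, the stability-based performance index
# becomes a function of the feedback gains, so the original functional
# extremum problem reduces to a finite-dimensional optimization.
import numpy as np
from scipy.optimize import minimize

def max_lyapunov_exponent(gains):
    """Placeholder: averaged maximal Lyapunov exponent of the controlled
    system as a function of the conservative and dissipative gains."""
    k_c, k_d = gains
    return 0.5 - 0.8 * k_d + 0.1 * k_c**2  # toy surrogate, not from the paper

def mean_control_effort(gains):
    """Placeholder: mean value of the quadratic form of the control force."""
    k_c, k_d = gains
    return k_c**2 + k_d**2

def performance_index(gains, weight=1.0):
    """Sigmoid of the maximal Lyapunov exponent plus a weighted penalty
    on the mean quadratic control effort (stability-based index)."""
    lam = max_lyapunov_exponent(gains)
    sigmoid = 1.0 / (1.0 + np.exp(-lam))
    return sigmoid + weight * mean_control_effort(gains)

# Minimize the index over the control gains with a standard optimizer.
result = minimize(performance_index, x0=np.array([0.1, 0.1]), method="Nelder-Mead")
print("optimal gains:", result.x, "index value:", result.fun)
```

For the reliability-based control described in the abstract, the sigmoid term would be replaced by the reciprocal of the mean first-passage time evaluated from the averaged equations.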