Two novel finite-time convergent recurrent neural networks for tackling complex-valued systems of linear equations
2020
Compared with a linear activation function, a suitable nonlinear activation
function can accelerate the convergence of a recurrent neural network. Based
on this finding, in this paper we propose two modified Zhang neural network
(ZNN) models with different nonlinear activation functions to tackle
complex-valued systems of linear equations (CVSLE). We first propose a novel
neural network, the NRNN-SBP model, by introducing the sign-bi-power
activation function. We then propose another novel neural network, the
NRNN-IRN model, by introducing a tunable activation function. Finally,
simulation results demonstrate that the NRNN-SBP and NRNN-IRN models converge
faster than the FTRNN model. These results also reveal that different
nonlinear activation functions have different effects on the convergence rate
for different CVSLE problems.
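To illustrate the general idea behind a ZNN with a sign-bi-power activation for a CVSLE problem Ax = b, the following is a minimal sketch, not the paper's actual models: it uses the standard ZNN error dynamics A x'(t) = -gamma * Phi(A x(t) - b) with the sign-bi-power function applied elementwise to the real and imaginary parts, integrated by explicit Euler. The function names, the gain gamma, the exponent r, and the step size are illustrative assumptions.

```python
import numpy as np

def sbp(e, r=0.5):
    """Sign-bi-power activation, applied elementwise to the real and
    imaginary parts of a complex residual vector (an assumed convention)."""
    def phi(x):
        return np.sign(x) * np.abs(x) ** r + np.sign(x) * np.abs(x) ** (1.0 / r)
    return phi(e.real) + 1j * phi(e.imag)

def znn_solve(A, b, gamma=10.0, dt=1e-3, steps=5000):
    """Euler-discretized ZNN for A x = b with complex A, b.

    Continuous dynamics: A x'(t) = -gamma * Phi(A x(t) - b),
    so the residual e(t) = A x(t) - b obeys e'(t) = -gamma * Phi(e(t)),
    which drives e to zero in finite time for the sign-bi-power Phi.
    """
    n = A.shape[1]
    x = np.zeros(n, dtype=complex)          # zero initial state
    A_inv = np.linalg.inv(A)                # small demo: explicit inverse
    for _ in range(steps):
        e = A @ x - b                       # current residual
        x = x - dt * gamma * (A_inv @ sbp(e))
    return x
```

For a well-conditioned complex matrix, `znn_solve(A, b)` drives the residual `A @ x - b` close to zero; the Euler discretization chatters at a small residual level set by `dt * gamma`, so a tolerance is needed when checking the result.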