Convergence analysis of the weighted state space search algorithm for recurrent neural networks
2014
Recurrent neural networks (RNNs) have emerged as a promising tool for modeling nonlinear dynamical systems. Convergence is one of the most important dynamical properties of RNNs in practical applications, since the viability of many of those applications depends on it. In this paper we study the convergence properties of the weighted state space search algorithm (WSSSA) -- a derivative-free, non-random learning algorithm that searches the neighborhood of the target trajectory in the state space rather than in the parameter space. Because no partial derivatives are computed, the WSSSA has several salient features: it is simple, fast, and cost-effective. We provide a necessary and sufficient condition for the convergence of the WSSSA, and we offer restrictions that help ensure its convergence to the desired solution. The asymptotic rate of convergence is also analyzed. Our study gives insight into the problem and provides useful information for the practical design of RNNs. A numerical example is given to support the theoretical analysis and to demonstrate the algorithm's applicability.
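The abstract does not spell out the WSSSA's update rules, but its central idea -- search a neighborhood of the target trajectory in the state space, rather than the parameter space, with no derivatives -- can be illustrated with a minimal sketch. Everything below (the `tanh` recurrent map, the least-squares weight fit, the names `rollout`, `fit_weights`, and `wsss_search`, and the accept-if-improved rule) is an illustrative assumption, not the paper's actual algorithm: candidate trajectories are sampled near the target, weights realizing each candidate are recovered by linear least squares on the pre-activations, and the candidate whose rolled-out trajectory best matches the target is kept.

```python
import numpy as np

def rnn_step(W, x):
    """One step of a simple recurrent map x_{t+1} = tanh(W x_t)."""
    return np.tanh(W @ x)

def rollout(W, x0, T):
    """Roll the recurrent map out for T steps starting from x0."""
    xs = [x0]
    for _ in range(T):
        xs.append(rnn_step(W, xs[-1]))
    return np.array(xs)

def fit_weights(traj):
    """Recover W so that tanh(W x_t) ~= x_{t+1} along a candidate
    trajectory, via linear least squares in the pre-activation."""
    X = traj[:-1]
    Y = np.arctanh(np.clip(traj[1:], -1 + 1e-9, 1 - 1e-9))
    W_T, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W_T.T

def wsss_search(target, n_iters=50, radius=0.05, seed=0):
    """Derivative-free, state-space neighborhood search (hypothetical
    sketch): perturb the target trajectory, fit weights to each
    candidate, keep the weights whose rollout tracks the target best."""
    rng = np.random.default_rng(seed)
    T = len(target) - 1
    best_W = fit_weights(target)
    best = float(np.mean((rollout(best_W, target[0], T) - target) ** 2))
    for _ in range(n_iters):
        cand = target + rng.normal(scale=radius, size=target.shape)
        cand[0] = target[0]  # keep the initial state fixed
        W = fit_weights(cand)
        err = float(np.mean((rollout(W, target[0], T) - target) ** 2))
        if err < best:       # accept only improving candidates
            best_W, best = W, err
    return best_W, best
```

Note the contrast with gradient descent: no partial derivatives of the loss are ever formed; each candidate is scored solely by rolling the fitted map forward and measuring its deviation from the target trajectory.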