Approximation of Function by Adaptively Growing Radial Basis Function Neural Networks
Citations: 3 · References: 4 · Related Papers: 20
Abstract:
In this paper a neural network for function approximation is described. The activation functions of the hidden nodes are radial basis functions (RBFs) whose parameters are learnt by a two-stage gradient-descent strategy. A new growing strategy, which inserts nodes with different radial basis functions, is used in order to improve the network's performance. The learning strategy saves computational time and memory space because of the selective growing of nodes whose activation functions consist of different radial basis functions. An analysis of the learning capabilities and a comparison of the network's performance with other approaches have been performed. It is shown that the resulting network improves the approximation results.
Keywords: Basis (linear algebra), Basis function, Activation function
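The two-stage training idea (solve the output weights for a fixed basis, then refine centers and widths by gradient descent) can be sketched in a few lines. This is a minimal NumPy illustration on a hypothetical 1-D target, not the paper's growing algorithm or its benchmarks:

```python
import numpy as np

def gaussian_rbf(x, centers, widths):
    """Hidden-layer outputs: one Gaussian bump per center."""
    d2 = (x[:, None] - centers[None, :]) ** 2
    return np.exp(-d2 / (2.0 * widths ** 2))

x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)                      # hypothetical target function

centers = np.linspace(-1.0, 1.0, 8)        # initial basis: 8 nodes
widths = np.full(8, 0.3)
lr = 0.005

for _ in range(100):
    H = gaussian_rbf(x, centers, widths)
    # Stage one: output weights by least squares for the current basis.
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    # Stage two: one gradient-descent step on centers and widths.
    err = H @ w - y
    g = err[:, None] * w[None, :] * H      # factor shared by both gradients
    diff = x[:, None] - centers[None, :]
    centers -= lr * np.sum(g * diff / widths ** 2, axis=0)
    widths -= lr * np.sum(g * diff ** 2 / widths ** 3, axis=0)

H = gaussian_rbf(x, centers, widths)
w, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = float(np.mean((H @ w - y) ** 2))
```

Alternating the least-squares solve with the gradient step keeps the output weights optimal for the current basis, which is what makes the second stage stable in this sketch.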
A traditional radial basis function (RBF) network takes Gaussian functions as its basis functions and adopts the least-squares (LS) criterion as the objective function. However, it is difficult to use Gaussian functions to approximate constant values: if a function is nearly constant on some intervals, the RBF network is inefficient at approximating those values. In this paper an RBF network that uses composites of sigmoidal functions in place of the Gaussian basis functions is proposed. It is also illustrated that the shape of the activation function can be made similar to a rectangular or Gaussian function, so constant-valued functions can be approximated accurately by an RBF network. A robust objective function is also adopted in the network to replace the LS objective function. Experimental results demonstrate that the proposed network approximates underlying functions better, with a fast learning speed and high robustness to outliers.
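A near-rectangular basis function can be built from sigmoids as a difference of two shifted sigmoids; the paper's exact composite is not reproduced here, so the construction and parameters below are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def composite_basis(x, center, half_width, steepness):
    # Difference of two shifted sigmoids: ~1 on the plateau
    # [center - half_width, center + half_width], ~0 far outside.
    left = sigmoid(steepness * (x - (center - half_width)))
    right = sigmoid(steepness * (x - (center + half_width)))
    return left - right

x = np.linspace(-2.0, 2.0, 401)
phi = composite_basis(x, center=0.0, half_width=0.5, steepness=50.0)
# The flat plateau near the center is what lets such a basis match
# constant-valued stretches of a target function, unlike a Gaussian peak.
```

Increasing `steepness` sharpens the edges toward a rectangle; decreasing it recovers a smoother, more Gaussian-like bump.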
Citations (9)
A new type of general radial basis function neural network is proposed, and its training method is investigated. Unlike the traditional three-layer RBF network, a basis function output weight layer is added, and a super curve is used to approximate any nonlinear curved surface. The simulation results of a function approximation show that, compared with the traditional RBF neural network, this network has better approximation performance and faster convergence, and can approximate any multivariable nonlinear function.
Citations (0)
Function approximation is a widely used method in system identification, and RBF networks have recently been proposed as powerful tools for it. Existing algorithms suffer from restrictions such as slow convergence and/or bias in parameter convergence. This paper attempts to improve on these problems by proposing new parameter-initialization and post-training methods that reach better learning time and precision than previous RBF networks.
Citations (0)
Presents a systematic approach for constructing reformulated radial basis function (RBF) neural networks, developed to facilitate their training by supervised learning algorithms based on gradient descent. This approach reduces the construction of radial basis function models to the selection of admissible generator functions. The selection of generator functions relies on the concept of the blind spot, which is introduced in the paper. The paper also introduces a new family of reformulated radial basis function neural networks, referred to as cosine radial basis functions. Cosine radial basis functions are constructed from linear generator functions of a special form, and their use as similarity measures in radial basis function models is justified by their geometric interpretation. A set of experiments on a variety of datasets indicates that cosine radial basis functions considerably outperform conventional radial basis function neural networks with Gaussian radial basis functions. Cosine radial basis functions are also strong competitors to existing reformulated radial basis function models trained by gradient descent and to feedforward neural networks with sigmoid hidden units.
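One form reported for the cosine radial basis function is phi(x) = sigma / sqrt(||x - v||^2 + sigma^2); the sketch below assumes that form and should be read as an illustration rather than the paper's definitive construction:

```python
import numpy as np

def cosine_rbf(x, center, sigma):
    # Assumed cosine RBF: the cosine of the angle between the augmented
    # vectors (x - center, sigma) and (0, sigma). It equals 1 at the
    # center and decays toward 0 as ||x - center|| grows, so it acts as
    # a similarity measure rather than a thresholded distance.
    d2 = np.sum((np.atleast_2d(x) - center) ** 2, axis=1)
    return sigma / np.sqrt(d2 + sigma ** 2)

center = np.array([0.0, 0.0])
vals = cosine_rbf(np.array([[0.0, 0.0], [0.5, 0.5]]), center, sigma=1.0)
```

Unlike a Gaussian, this basis decays polynomially with distance, which changes how far each unit's influence reaches.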
Citations (123)
In this paper, a fully complex radial basis function (FC-RBF) network and a gradient-descent learning algorithm are presented. Many complex-valued RBF learning algorithms in the literature use a split-complex network with a real activation function in the hidden layer, i.e., an activation function mapping C^n → R. Hence these algorithms do not consider the influence of phase change explicitly and do not approximate phase accurately. In this paper, a Gaussian-like fully complex activation function sech(·) (C^n → C) and a well-defined gradient-descent learning algorithm are developed for an FC-RBF network using sech(·) as the activation function. The performance of the FC-RBF network has been evaluated on two synthetic complex-valued function approximation problems, a complex XOR (C-XOR) problem, and a non-minimum-phase equalization problem. The results indicate the better performance of the FC-RBF network compared to existing split-complex RBF network methods.
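The sech activation itself is straightforward to evaluate on complex arguments; a minimal sketch (the surrounding network and learning algorithm are not reproduced):

```python
import numpy as np

def sech(z):
    # sech(z) = 2 / (e^z + e^-z). With a complex argument the output is
    # complex, so both magnitude and phase of the input influence it,
    # unlike a split-complex (C^n -> R) activation.
    return 2.0 / (np.exp(z) + np.exp(-z))

out = sech(np.array([0.0 + 0.0j, 1.0 + 1.0j]))
# out[0] is exactly 1; out[1] has a nonzero imaginary part,
# i.e. the phase information is carried through the activation.
```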
Citations (14)
In this paper, we propose an initial optimized structure for radial basis function networks that is simple and converges rapidly. We construct the hidden nodes with radial basis functions whose localization is similar to that of the approximation target function in the time-frequency plane. We finally choose a good initial structure for function approximation using a genetic algorithm.
Citations (1)
A new radial basis function (RBF) network training procedure that employs a linear projection technique during parameter search is proposed. To be applied simultaneously with conventional center and/or weight adjustment methods, a gradient-descent iteration on the width parameters of the RBF units is introduced. The projection mechanism used by the procedure avoids negative width parameters and enables detection of redundant units, which can then be pruned from the network. The proposed training approach is applied to design a feedback neuro-controller that tracks a desired trajectory for a nonlinear plant.
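A projected width update can be sketched as follows; clipping to a small positive floor is an illustrative stand-in for the paper's linear projection technique, and the floor value and pruning rule are assumptions:

```python
import numpy as np

def width_step(widths, grad, lr, floor=1e-6):
    """One projected gradient-descent step on RBF width parameters.

    The plain step may drive a width negative; projecting back onto the
    feasible set keeps all widths strictly positive, and widths pinned at
    the floor mark candidate redundant units for pruning.
    """
    stepped = widths - lr * grad              # unconstrained descent step
    projected = np.maximum(stepped, floor)    # projection onto widths > 0
    redundant = projected <= floor            # units collapsed to the floor
    return projected, redundant

new_widths, redundant = width_step(np.array([0.5, 0.01]),
                                   np.array([1.0, 5.0]), lr=0.1)
```

Here the second unit's step would have produced a negative width, so it is pinned at the floor and flagged as prunable while the first unit updates normally.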
Citations (6)
This paper proposes supervised learning algorithms based on gradient descent for training reformulated radial basis function (RBF) neural networks. Such RBF models employ radial basis functions whose form is determined by admissible generator functions. RBF networks with Gaussian radial basis functions are generated by exponential generator functions. A sensitivity analysis provides the basis for selecting generator functions by investigating the effect of linear, exponential and logarithmic generator functions on gradient descent learning. Experiments involving reformulated RBF networks indicate that the proposed gradient descent algorithms guarantee fast learning and very satisfactory function approximation capability.
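Taking the reformulation phi = g0(||x - v||^2)^(1/(1 - m)) with m > 1 (an assumed form for reformulated RBFs, not quoted from the paper), an exponential generator recovers a Gaussian-shaped basis and a linear generator an inverse-multiquadric-like one:

```python
import numpy as np

def reformulated_rbf(d2, generator, m=3):
    # Assumed reformulation: phi = g0(||x - v||^2) ** (1 / (1 - m)), m > 1,
    # where d2 holds squared distances and `generator` is g0.
    return generator(d2) ** (1.0 / (1.0 - m))

d2 = np.linspace(0.0, 4.0, 5)                          # squared distances
gaussian_like = reformulated_rbf(d2, np.exp)           # exponential generator
linear_gen = reformulated_rbf(d2, lambda u: u + 1.0)   # linear generator
# With m = 3 the exponential generator gives exp(-d2 / 2), a Gaussian in
# the distance, while the linear generator gives (d2 + 1) ** -0.5.
```

Swapping only the generator function changes the basis shape while leaving the rest of the network and the gradient-descent machinery untouched, which is the appeal of the reformulated framework.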
Citations (18)