    Comparative Study of Radial Basis Function Neural Network with Estimation of Eigenvalue in Image Using MATLAB
Keywords: Basis (linear algebra), Basis function, Matrix (mathematics)

Related Papers
This paper proposes a cellular neural network (CNN) model with a radial basis input function (radial basis input CNN) to improve the function approximation ability of CNNs. The model can be viewed as a cascade of two units: the first is a multi-input, multi-output radial basis function network (RBFN); the second is the original CNN model. The weights and centers of the RBFN unit are chosen to be identical for all RBFN outputs, yielding a space-invariant connection weight pattern over the network. With this weight-sharing property, the proposed model becomes a special kind of nonlinear B-template CNN. The ability of the radial basis input CNN to approximate functions via its input-to-(steady-state)-output mapping is examined on an edge detection task for noisy images. A modified version of the recurrent perceptron learning algorithm (RPLA) is used to train the radial basis input CNN.
Keywords: Cellular neural network, Perceptron, Basis (linear algebra), Activation function, Basis function
    Citations (1)
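A minimal numpy sketch of the space-invariant RBF input unit described above (all names, sizes, and the Gaussian form are illustrative assumptions, not the authors' implementation): the same centres, widths, and output weights are applied to every pixel neighbourhood, so the unit acts as a nonlinear, shared-weight template over the image.

```python
import numpy as np

def rbf_input_unit(image, centers, widths, out_weights, k=3):
    """Space-invariant RBF unit: the same Gaussian centres/widths are
    applied to every k x k neighbourhood (weight sharing), so the unit
    behaves like a nonlinear B-template. Details are illustrative only."""
    H, W = image.shape
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            patch = padded[i:i + k, j:j + k].ravel()       # k*k input vector
            d2 = np.sum((centers - patch) ** 2, axis=1)    # squared distances
            phi = np.exp(-d2 / (2.0 * widths ** 2))        # RBF activations
            out[i, j] = phi @ out_weights                  # linear output layer
    return out

# Toy usage: 8 random centres over 3x3 patches of a random image.
rng = np.random.default_rng(0)
img = rng.random((16, 16))
centers = rng.random((8, 9))
widths = np.full(8, 0.5)
w = rng.standard_normal(8)
feature_map = rbf_input_unit(img, centers, widths, w)
```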
Radial basis function networks (RBFNs) have been successfully employed in different machine learning problems, and the use of different radial basis functions in RBFNs has been reported in the literature. Here, we discuss the use of the q-Gaussian function as a radial basis function in RBFNs. An interesting property of the q-Gaussian is that it can continuously and smoothly reproduce different radial basis functions, such as the Gaussian, the inverse multiquadric, and the Cauchy functions, by changing a real parameter q. In addition, we discuss the mixed use of different shapes of radial basis functions in a single RBFN. For this purpose, a genetic algorithm is employed to select the number of hidden neurons, the width of each RBF, and the q parameter of the q-Gaussian associated with each radial unit.
Keywords: Basis (linear algebra), Basis function
    Citations (3)
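The shape-morphing property of the q-Gaussian is easy to illustrate (a hedged sketch; the parameterization below is one common convention, not necessarily the authors' exact form):

```python
import numpy as np

def q_gaussian(r, sigma=1.0, q=1.0):
    """q-Gaussian radial basis function (illustrative form).
    q -> 1 recovers the Gaussian exp(-r^2/sigma^2); q = 2 gives the
    Cauchy function 1/(1 + r^2/sigma^2); q = 3 yields the
    inverse-multiquadric-like shape 1/sqrt(1 + 2 r^2/sigma^2)."""
    u = r ** 2 / sigma ** 2
    if np.isclose(q, 1.0):
        return np.exp(-u)
    base = 1.0 - (1.0 - q) * u          # clipped at 0 for compact support (q < 1)
    return np.where(base > 0, base, 0.0) ** (1.0 / (1.0 - q))

r = np.linspace(0, 3, 7)
print(q_gaussian(r, q=1.0))   # Gaussian
print(q_gaussian(r, q=2.0))   # Cauchy: 1/(1 + r^2)
print(q_gaussian(r, q=3.0))   # 1/sqrt(1 + 2 r^2)
```

Varying q between these values morphs one basis shape continuously into another, which is what makes a single real parameter sufficient to cover the family.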
The radial basis function network is well suited to function approximation and pattern recognition. It can perform local learning at each neuron, which gives it an advantage over multilayer neural networks. However, the number of neurons required for a radial basis function network to approximate an unknown nonlinear function is not known in advance, which leads to slow learning and overfitting. We previously proposed a competitive radial basis function network to address these problems. It learns effectively by means of a synapse plasticity equation that accounts for competition among the synapse weights. The competitive radial basis function network can remove redundant radial basis functions, but it cannot add necessary ones. In this paper we therefore propose an effective method for adding necessary neurons, based on an investigation of the synapse plasticity equation. In this method, the newly added radial basis function takes over some properties of the radial basis functions acquired by the time learning converges. We then propose a reproductive and competitive radial basis function network, which combines this method with the competitive radial basis function network. Simulations show that the reproductive and competitive radial basis function network effectively decreases the sum of squared errors in the function approximation problem, and it learns faster than the usual radial basis function network. © 2000 Scripta Technica, Syst Comp Jpn, 31(13): 65–75, 2000
Keywords: Basis (linear algebra), Basis function, Overfitting, Activation function
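The synapse plasticity equation itself is not reproduced in the abstract, so the fragment below only sketches the general mechanism under assumed details: a plain weight-decay term stands in for the competition among synapse weights, and units whose weights decay to near zero are pruned as redundant.

```python
import numpy as np

def rbf_outputs(X, centers, widths):
    """Gaussian RBF activations for inputs X against all unit centres."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * widths ** 2))

def competitive_step(X, y, centers, widths, w, lr=0.05, decay=0.01):
    """One gradient step on squared error plus a decay term that stands
    in for the competition among synapse weights (an assumption; the
    paper's plasticity equation differs in detail)."""
    Phi = rbf_outputs(X, centers, widths)
    err = Phi @ w - y
    return w - lr * (Phi.T @ err / len(X)) - lr * decay * w

def prune(centers, widths, w, tol=1e-3):
    """Remove units whose weights the competition has driven to ~0."""
    keep = np.abs(w) > tol
    return centers[keep], widths[keep], w[keep]
```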
A novel modelling framework is proposed for constructing parsimonious and flexible radial basis function (RBF) network models. Unlike a conventional Gaussian-kernel RBF network, where all basis functions share the same scale (kernel width) or each basis function has a single individual scale, the new construction approach adopts multiscale kernels (multiple kernel widths for each selected centre) as the basis functions, providing more flexible representations with better generalization properties for general nonlinear dynamical systems. A standard orthogonal least squares (OLS) algorithm is then applied to select significant model terms (basis functions) and obtain parsimonious models.
Keywords: Basis (linear algebra), Basis function, Kernel (algebra), Hierarchical RBF
    Citations (0)
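A compact sketch of the multiscale construction (hedged: the greedy error-reduction criterion below is a simplified stand-in for the classical OLS procedure, and all names are assumptions): each candidate centre contributes one Gaussian column per kernel width, and forward selection repeatedly picks the column that most reduces the current residual.

```python
import numpy as np

def multiscale_dictionary(X, centers, widths):
    """Candidate basis matrix: one Gaussian column per (centre, width) pair."""
    cols = []
    for c in centers:
        d2 = ((X - c) ** 2).sum(axis=1)
        for s in widths:
            cols.append(np.exp(-d2 / (2.0 * s ** 2)))
    return np.column_stack(cols)

def ols_select(Phi, y, n_terms):
    """Greedy forward selection: pick the column with the largest
    error-reduction ratio against the current residual, then deflate
    the remaining candidates (simplified OLS-style orthogonalization)."""
    residual = y.astype(float).copy()
    P = Phi.astype(float).copy()
    selected = []
    for _ in range(n_terms):
        norms = (P ** 2).sum(axis=0)
        norms[norms < 1e-12] = np.inf            # skip spent columns
        score = (P.T @ residual) ** 2 / norms    # error reduction ratio
        k = int(np.argmax(score))
        selected.append(k)
        p = P[:, k]
        residual -= ((p @ residual) / (p @ p)) * p
        P -= np.outer(p, (p @ P) / (p @ p))      # Gram-Schmidt deflation
    return selected
```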
In this paper a neural network for function approximation is described. The activation functions of the hidden nodes are radial basis functions (RBFs) whose parameters are learnt by a two-stage gradient descent strategy. A new growing strategy that inserts nodes with different radial basis functions is used to improve the network's performance. The learning strategy saves computation time and memory because nodes are grown selectively, with activation functions drawn from different radial basis functions. An analysis of the learning capabilities and a comparison with other approaches have been performed, showing that the resulting network improves the approximation results.
Keywords: Basis (linear algebra), Basis function, Activation function
    Citations (3)
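One common way to realize such a growing strategy, sketched under assumed details (the paper's two-stage gradient descent and its actual insertion criterion are more involved), is to insert a new unit centred at the training point with the largest residual:

```python
import numpy as np

def insert_node(X, y, centers, widths, w, new_width=0.5):
    """Grow the network by one unit: place a new RBF at the input with
    the largest absolute residual and seed its weight with that residual.
    new_width is an assumed default, not a value from the paper."""
    Phi = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
                 / (2.0 * widths ** 2))
    residual = y - Phi @ w
    i = int(np.argmax(np.abs(residual)))
    centers = np.vstack([centers, X[i]])
    widths = np.append(widths, new_width)
    w = np.append(w, residual[i])
    return centers, widths, w
```

After each insertion, all parameters would typically be refined by further gradient descent before deciding whether to grow again.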
Since the introduction of neural networks as multivariable function approximators, radial basis function (RBF) networks have been studied from many different angles. From the theoretical viewpoint, the approximation and uniqueness of the interpolation have been studied, and it has been established that an RBF network can approximate any multivariate continuous function arbitrarily well, provided enough radial basis functions are employed. The number of hidden nodes, the type of radial basis functions, the widths of the basis functions, and the cluster centres of the basis functions are examples of issues on which numerous research works have appeared in the literature. In contrast, remarkably few papers examine the functional approximation from the frequency-domain viewpoint. They identify that basis functions essentially behave as low-pass filters. Because of this over-filtering effect, RBF networks are unfavourable for high frequencies unless a relatively large number of hidden nodes is used. For approximations involving only low-frequency components, RBF networks therefore give satisfactory results, which is presumably the case in many of the successful RBF applications reported in the literature; the converse holds for high-frequency targets. Considering the filtering characteristics of different radial basis functions, one can improve the performance of RBF networks with a mixture of radial basis functions.
Keywords: Basis (linear algebra), Basis function, Interpolation, Radial function, Hierarchical RBF
    Citations (0)
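The low-pass claim is straightforward to check numerically (a hedged illustration; the widths and threshold below are arbitrary choices): the magnitude spectrum of a sampled Gaussian basis function decays rapidly, and the wider the kernel, the earlier it falls off.

```python
import numpy as np

# Sample two Gaussian basis functions of different widths and compare
# their magnitude spectra: both act as low-pass filters, and the wider
# kernel suppresses high frequencies more strongly.
x = np.linspace(-8, 8, 1024)
for sigma in (0.5, 2.0):
    phi = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    spectrum = np.abs(np.fft.rfft(phi))
    spectrum /= spectrum[0]                      # normalize DC gain to 1
    cutoff = int(np.argmax(spectrum < 1e-3))     # first bin below -60 dB
    print(f"sigma={sigma}: spectrum falls below 1e-3 at bin {cutoff}")
```

The narrow kernel (sigma = 0.5) keeps high-frequency content out to a much later bin than the wide one, which is the over-filtering effect the abstract describes.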
This paper discusses radial basis function (RBF) neural networks for radar target classification. To enhance the classification rate, a modified radial basis function (MRBF) neural network structure is proposed. Two kinds of MRBF networks, called the MRBF1 network and the MRBF2 network, are discussed. Both theory and computer simulations show that the MRBF network outperforms the RBF network and that the MRBF2 network achieves a higher classification rate than the MRBF1 network.
Keywords: Basis (linear algebra), Hierarchical RBF
    Citations (5)