Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction
0 Citations · 0 References · 10 Related Papers
Abstract:
Multi-modal regression is important for forecasting nonstationary processes or processes governed by a complex mixture of distributions. It can be tackled with multiple-hypotheses frameworks, though combining the hypotheses efficiently within a learning model is difficult. A Structured Radial Basis Function Network is presented as an ensemble of multiple-hypotheses predictors for regression problems. The predictors are regression models of any type that can form centroidal Voronoi tessellations, which are a function of their losses during training. It is proved that this structured model can efficiently interpolate this tessellation and approximate the multiple-hypotheses target distribution, and that doing so is equivalent to interpolating the meta-loss of the predictors, the loss being a zero set of the interpolation error. The model admits a fixed-point iteration algorithm between the predictors and the centers of the basis functions. Diversity in learning can be controlled parametrically by truncating the tessellation formation with the losses of individual predictors. A closed-form least-squares solution is presented which, to the authors' knowledge, is the fastest solution in the literature for multiple hypotheses and structured predictions. Superior generalization performance and computational efficiency are achieved using only two-layer neural networks as predictors, with control of diversity as a key component of success. A gradient-descent approach is introduced that is loss-agnostic with respect to the predictors. The expected value of the loss of the structured model with Gaussian basis functions is computed, showing that correlation between predictors is not an appropriate tool for diversification. The experiments show outperformance with respect to the top competitors in the literature.
We use a radial basis function (RBF) network to approximate the fitness function of genetic algorithms and attempt to obtain approximately optimal results. An RBF network is a neural network composed of a number of Gaussian radial basis functions. When the positions of the basis functions and their radii are given, training the RBF network reduces to the calculation of a matrix inverse, so the learning procedure is quite simple and very fast. There are two important issues in an RBF network: one is where to place the basis functions given the data, and the other is how to assign an appropriate radius to each radial basis function. For the first issue, we have previously proposed a data-distribution and basis-function-distribution method, together with adaptive-range genetic algorithms (ARange GAs). In this study, we focus on the second problem. For that purpose, we give the basis functions an elliptical distribution and assume that every basis function has the same elliptical radii. In this way, we reduce the number of radius parameters and hence the number of design variables. We show the effectiveness of the proposed method on a benchmark problem.
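Given fixed centres and radii, the weight-learning step described above is a single linear solve. A minimal NumPy sketch of that step on a toy 1-D function (the centre spacing, radius, and data are illustrative, not taken from the paper):

```python
import numpy as np

def rbf_design_matrix(X, centers, radius):
    # Gaussian basis: phi[i, j] = exp(-||x_i - c_j||^2 / (2 r^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * radius ** 2))

def fit_rbf_weights(X, y, centers, radius):
    # the "calculation of a matrix inverse" step, done as a
    # least-squares (pseudo-inverse) solve for numerical stability
    Phi = rbf_design_matrix(X, centers, radius)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

# toy target approximated from 20 samples with 8 basis functions
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
centers = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
w = fit_rbf_weights(X, y, centers, radius=0.15)
pred = rbf_design_matrix(X, centers, 0.15) @ w
```

Once the design matrix is formed, no iterative training is needed, which is why the abstract can describe the learning system as simple and fast.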
This paper presents work on implementing the radial basis function algorithm in a very-high-speed integrated circuit hardware description language (VHDL) using perceptron learning. Neural network hardware is usually defined as the class of devices designed to implement neural architectures and learning algorithms. The radial basis function (RBF) network is a two-layer network whose output units form a linear combination of the basis functions computed by the hidden units, where each hidden-unit function is a Gaussian. A radial basis function attains its maximum of 1 when its input is 0, and its output increases as the distance between the weight vector and the input decreases. Thus, a radial basis neuron acts as a detector that outputs 1 whenever the input is identical to its weight vector.
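The detector behaviour described above can be checked directly. A small sketch of a single Gaussian radial basis neuron (the weight vector and spread are made-up values for illustration):

```python
import numpy as np

def rbf_neuron(x, w, spread=1.0):
    # Gaussian of the distance between input and weight vector:
    # maximum of 1 when x == w, decaying as the distance grows
    return np.exp(-np.sum((x - w) ** 2) / (2.0 * spread ** 2))

w = np.array([0.5, -1.0, 2.0])
peak = rbf_neuron(w, w)        # identical input: the neuron "fires" at 1
near = rbf_neuron(w + 0.1, w)  # small displacement: output close to 1
far = rbf_neuron(w + 1.0, w)   # large displacement: output decays
```

The neuron therefore behaves as a template matcher centred on its weight vector, which is the property the hardware implementation exploits.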
Development of radial basis function networks (RBFNs) can be divided into two stages: first, learning the centres and widths of the radial basis functions, and second, learning the connection weights. The performance of an RBFN depends entirely on these two learning steps. Hence, in this paper we propose a new algorithm in which the centres and widths of the radial basis functions for regression problems are selected using the Taguchi method. Experiments on function estimation are conducted to illustrate the performance of the proposed algorithm.
This paper introduces the radial basis function (RBF) network to seismic data processing and implements data interpolation in seismic processing using a function-approximation method. The method gave satisfactory results when applied to real seismic data. The work mainly studies the theory, methods, applications, and approximation capability of radial basis function networks. The network can fully exploit the information contained in the training data, choosing the centres of the radial basis functions and the network weights one by one until an adequate network has been constructed, which provides a simple and efficient means of growing radial basis function networks. Radial basis function networks are therefore a promising class of neural networks with broad application prospects.
The radial basis function network is a neural network well suited to function approximation and pattern recognition. It can perform local learning at each neuron, which makes it superior in this respect to multilayer neural networks. However, the number of neurons required for a radial basis function network to approximate an unknown nonlinear function is not known in advance, which leads to slow learning and overfitting. We previously proposed a competitive radial basis function network to address these problems. The competitive network learns effectively by means of a synapse-plasticity equation that accounts for competition among the synapse weights; it can remove redundant radial basis functions, but it cannot add necessary ones. In this paper we therefore propose an effective method for adding necessary neurons based on an analysis of the synapse-plasticity equation. In this method, a newly added radial basis function takes over some of the properties acquired by the existing radial basis functions when learning converges. We then propose a reproductive and competitive radial basis function network, a neural network combining this method with the competitive radial basis function network. Simulations show that the reproductive and competitive radial basis function network effectively decreases the sum of squared errors when applied to the function approximation problem, and that it learns faster than the usual radial basis function network. © 2000 Scripta Technica, Syst Comp Jpn, 31(13): 65–75, 2000
This paper discusses the rationale for employing alternative basis functions to the ubiquitous Gaussian in radial basis function networks. In particular, the author concentrates on unbounded basis functions (though the network as a whole remains bounded) and non-positive-definite basis functions. Although counterintuitive in application domains such as classification and time-series forecasting, the use of unbounded and non-positive-definite basis functions has a sound theoretical motivation from functional interpolation and kernel-based density estimation. The use of non-Gaussian radial basis function networks is demonstrated on real-world data.
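One standard unbounded radial basis from the interpolation literature is Hardy's multiquadric, which grows with distance rather than decaying, yet still yields an exact interpolant. A sketch, assuming the multiquadric as the alternative basis (the shape parameter and sample points are illustrative):

```python
import numpy as np

def multiquadric(r, c=1.0):
    # unbounded, non-positive-definite radial basis: sqrt(r^2 + c^2)
    return np.sqrt(r ** 2 + c ** 2)

def interpolation_weights(X, y, c=1.0):
    # exact interpolation: solve A w = y with A[i, j] = phi(|x_i - x_j|)
    A = multiquadric(np.abs(X[:, None] - X[None, :]), c)
    return np.linalg.solve(A, y)

X = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.cos(X)
w = interpolation_weights(X, y)
# the interpolant reproduces the data exactly at the nodes
recon = multiquadric(np.abs(X[:, None] - X[None, :])) @ w
```

The solvability of this system for distinct points is exactly the kind of interpolation-theoretic guarantee the paper appeals to when motivating non-Gaussian bases.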
In this paper, we use the radial basis function (RBF) network to approximate the fitness function of genetic algorithms and attempt to obtain approximately optimal results within a relatively small number of function calls. The RBF network is a neural network composed of a number of Gaussian radial basis functions. Its learning system consists of incrementally adding a basis function when new data arrive and forgetting a basis function associated with undesirable data. The key issues in an RBF network are therefore when to supply new data and where to place the basis functions; if these choices are made appropriately, approximate optimization can be carried out even when the optimal solutions lie outside the range of the initial settings. Together with the adaptive-range genetic algorithms proposed for mixed-variable optimization, we propose a way to supply new data and basis functions. We show the effectiveness of the proposed method on simple numerical examples of unconstrained optimal design.
Radial basis functions can be combined into a network structure that has several advantages over conventional neural network solutions. However, to operate effectively, the number and positions of the basis-function centres must be carefully selected. Although no rigorous algorithm exists for this purpose, several heuristic methods have been suggested. In this paper a new method is proposed in which radial basis function centres are selected by the mean-tracking clustering algorithm. The mean-tracking algorithm is compared with k-means clustering and shown to achieve significantly better results in terms of radial basis function performance. As well as being computationally simpler, the mean-tracking algorithm generally selects better centre positions, providing the radial basis functions with better modelling accuracy.
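The mean-tracking algorithm itself is not specified in this abstract, but the k-means baseline it is compared against can be sketched as a centre selector. A minimal pure-NumPy version of Lloyd's iteration on illustrative two-blob data (the initialisation scheme is a reproducibility choice of this sketch, not from the paper):

```python
import numpy as np

def kmeans_centers(X, k, iters=50):
    # classic Lloyd iteration used as an RBF-centre selector;
    # initialised from evenly spaced samples for reproducibility
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # assign each point to its nearest centre
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        # move each centre to the mean of its assigned points
        for j in range(k):
            pts = X[labels == j]
            if len(pts):  # leave empty clusters where they are
                centers[j] = pts.mean(axis=0)
    return centers

# two well-separated blobs: one centre should land near each
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal(5.0, 0.1, (50, 2))])
centers = kmeans_centers(X, k=2)
```

Placing centres at cluster means in this way gives each radial basis function a region of the input space to cover, which is the role the mean-tracking algorithm is reported to fill more accurately.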
This paper discusses radial basis function (RBF) neural networks for radar target classification. To improve the classification rate, a modified radial basis function (MRBF) neural network structure is proposed. Two kinds of MRBF networks, called MRBF1 and MRBF2, are discussed. Both theory and computer simulations show that the MRBF networks outperform the plain RBF network, and that the MRBF2 network achieves a higher classification rate than the MRBF1 network.