Design techniques for the control of errors in backpropagation neural networks
1993
A significant problem in the design and construction of an artificial neural network for function approximation is limiting the magnitude and variance of errors when the network is used in the field. Network errors can occur when the training data does not faithfully represent the required function due to noise or low sampling rates, when the network's flexibility does not match the variability of the data, or when the input data to the resultant network is noisy. This paper reports on several experiments whose purpose was to rank the relative significance of these error sources and thereby find neural network design principles for limiting the magnitude and variance of network errors.
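To make the abstract's error sources concrete, the following is a minimal, hypothetical sketch (not the paper's experimental setup): a one-hidden-layer network trained by backpropagation to approximate sin(x) from noisy, sparsely sampled data, then evaluated on clean and noisy inputs. The sample count, noise level, and hidden-layer width are illustrative stand-ins for the error sources named above, and all names and values are assumptions.

```python
# Minimal backpropagation sketch illustrating the error sources named in the abstract.
import numpy as np

rng = np.random.default_rng(0)

# Error sources 1 and 2: low sampling rate and noisy training targets.
n_train, noise_std = 20, 0.1
x_train = rng.uniform(-np.pi, np.pi, size=(n_train, 1))
y_train = np.sin(x_train) + rng.normal(0.0, noise_std, size=(n_train, 1))

# Error source 3: network flexibility (hidden width) vs. variability of the data.
hidden = 10
W1 = rng.normal(0.0, 0.5, size=(1, hidden))
b1 = np.zeros((1, hidden))
W2 = rng.normal(0.0, 0.5, size=(hidden, 1))
b2 = np.zeros((1, 1))

lr = 0.05
for _ in range(20000):
    # Forward pass with tanh hidden units.
    h = np.tanh(x_train @ W1 + b1)
    y_hat = h @ W2 + b2

    # Backpropagate the mean-squared-error gradient.
    d_y = 2.0 * (y_hat - y_train) / n_train
    d_W2 = h.T @ d_y
    d_b2 = d_y.sum(axis=0, keepdims=True)
    d_h = d_y @ W2.T * (1.0 - h ** 2)
    d_W1 = x_train.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

# Error source 4: noisy inputs at deployment time. Compare field error
# statistics (mean and variance) on clean versus noise-corrupted inputs.
x_test = np.linspace(-np.pi, np.pi, 500).reshape(-1, 1)
for label, x_in in [("clean inputs", x_test),
                    ("noisy inputs", x_test + rng.normal(0.0, noise_std, x_test.shape))]:
    err = np.tanh(x_in @ W1 + b1) @ W2 + b2 - np.sin(x_test)
    print(f"{label}: mean error {err.mean():+.4f}, variance {err.var():.4f}")
```

Varying n_train, noise_std, and hidden in such a sketch gives a rough sense of how each factor contributes to the magnitude and variance of the approximation error, which is the kind of ranking the paper investigates experimentally.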
Keywords:
- Time delay neural network
- Types of artificial neural networks
- Stochastic neural network
- Physical neural network
- Recurrent neural network
- Rectifier (neural networks)
- Machine learning
- Artificial intelligence
- Computer science
- Probabilistic neural network
- Nervous system network models
- Multilayer perceptron
- Data mining
- Feedforward neural network
- Catastrophic interference