Implementation of the RBF neural chip with the back-propagation algorithm for on-line learning

2015 
This article presents the hardware implementation of a floating-point processor (FPP) used to develop a radial basis function (RBF) neural network for general-purpose pattern recognition and nonlinear control. The floating-point processor is designed on a field-programmable gate array (FPGA) chip to execute the nonlinear functions required in the parallel calculation of the back-propagation algorithm. The internal weights of the RBF network are updated by the on-line learning back-propagation algorithm. The on-line learning process of the RBF chip is compared numerically with the results of the RBF neural network learning process written in MATLAB. The performance of the designed RBF neural chip is tested on the real-time pattern classification of the XOR logic, and performance is evaluated by comparing results against MATLAB through extensive experimental studies.
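As a rough software reference for the learning rule described above, the sketch below trains a small RBF network on the XOR patterns with a per-sample (on-line) gradient-descent back-propagation rule, updating the output weights, Gaussian centers, and widths after every presented pattern. It is an illustrative reconstruction only, not the authors' FPGA or MATLAB implementation: the number of hidden units, the Gaussian basis, the learning rate, and the initialization are all assumptions.

```python
# Minimal sketch (assumed parameters): RBF network with Gaussian hidden units,
# trained on XOR by on-line gradient descent over weights, centers, and widths.
import numpy as np

rng = np.random.default_rng(0)

# XOR patterns and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

n_hidden = 4                                           # assumed hidden-layer size
centers = rng.uniform(0.0, 1.0, size=(n_hidden, 2))    # Gaussian centers c_j
sigmas = np.full(n_hidden, 0.5)                        # Gaussian widths sigma_j
weights = rng.uniform(-0.5, 0.5, size=n_hidden)        # output weights w_j
bias = 0.0
lr = 0.1                                               # assumed learning rate

def hidden_activations(x):
    """phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))"""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigmas ** 2))

for epoch in range(2000):
    for x, t in zip(X, T):                  # on-line: update after every sample
        phi = hidden_activations(x)
        y = weights @ phi + bias            # linear output unit
        e = t - y                           # output error
        diff = x - centers                  # uses pre-update centers
        d2 = np.sum(diff ** 2, axis=1)

        # Gradient-descent (back-propagation) updates for all free parameters,
        # all derived from E = 0.5 * e^2 via the chain rule through phi_j.
        w_old = weights.copy()
        weights += lr * e * phi
        bias    += lr * e
        centers += lr * e * (w_old * phi)[:, None] * diff / sigmas[:, None] ** 2
        sigmas  += lr * e * w_old * phi * d2 / sigmas ** 3
        sigmas   = np.maximum(sigmas, 0.05)  # keep widths positive and away from zero

for x, t in zip(X, T):
    y = float(weights @ hidden_activations(x) + bias)
    print(x, "->", round(y, 3), "target", t)
```

After training, the network output should approach 0 for the (0,0) and (1,1) patterns and 1 for the mixed patterns, which is the XOR classification behavior the chip is tested on; the paper performs the analogous computation in fixed FPGA hardware with floating-point arithmetic.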