Comparative study of the Backpropagation (BP) and Multiple Linear Regression (MLR) algorithms through statistical data analysis applied to artificial neural networks

2020 
The objective of this research is to compare a Backpropagation algorithm implemented by the user in free Java software against the Multiple Linear Regression algorithm; the comparison rests on a descriptive statistical analysis based on an artificial neural network. Two prediction models were applied to 451 patterns (records): the first 401 rows were used to train the neural network and the remaining 50 records were used for validation and testing. Each pattern consists of 4 input variables (height above sea level, fall, net fall, flux) and 1 variable to predict (turbine power). Across the different tests, the training and selection parameters that gave the best results were: the architecture of the neural network, the type of data scaling, the initial range of the weights and thresholds, the learning rate and momentum, the batched/online training mode, and the number of training epochs.

Among the comparison results of the analyzed algorithms, it was determined that the error at the highest numbers of iterations is lower than the error over the responses of the 50 test patterns. In the Multiple Linear Regression comparison, the real value is the value of the variable to be predicted, as supplied by the user, while the predicted value is the one produced by the neural network; the prediction error is the difference obtained by subtracting the predicted value from the real one, and the total error, the quantity to be minimized, represents the calculated error over all the data, that is, the error percentage of the backpropagation neural network. The lower this percentage, the better the network performs.
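As a rough illustration of the training setup described above, the following Java sketch implements one online (pattern-by-pattern) backpropagation pass for a network with 4 inputs, a sigmoid hidden layer, and 1 linear output, updated with a learning rate and momentum. The hidden-layer size, learning rate, momentum, and weight-initialization range are illustrative assumptions, not the values selected in the study.

import java.util.Random;

// Minimal online-backpropagation sketch: 4 inputs -> sigmoid hidden layer -> 1 linear output.
// Hidden-layer size, learning rate, momentum and the initial weight range are assumptions.
public class BackpropSketch {
    static final int N_IN = 4, N_HID = 6;          // 4 inputs; assumed 6 hidden units
    static final double LR = 0.05, MOMENTUM = 0.9; // assumed learning rate and momentum

    final double[][] wHid = new double[N_HID][N_IN + 1]; // hidden weights (+ bias/threshold)
    final double[] wOut = new double[N_HID + 1];         // output weights (+ bias/threshold)
    final double[][] dHidPrev = new double[N_HID][N_IN + 1];
    final double[] dOutPrev = new double[N_HID + 1];

    BackpropSketch(long seed) {
        Random r = new Random(seed);                      // initial weights/thresholds in [-0.3, 0.3] (assumed range)
        for (double[] row : wHid)
            for (int j = 0; j < row.length; j++) row[j] = r.nextDouble() * 0.6 - 0.3;
        for (int j = 0; j < wOut.length; j++) wOut[j] = r.nextDouble() * 0.6 - 0.3;
    }

    double predict(double[] x, double[] hid) {
        double y = wOut[N_HID];                           // output bias
        for (int h = 0; h < N_HID; h++) {
            double a = wHid[h][N_IN];                     // hidden bias
            for (int i = 0; i < N_IN; i++) a += wHid[h][i] * x[i];
            hid[h] = 1.0 / (1.0 + Math.exp(-a));          // sigmoid activation
            y += wOut[h] * hid[h];
        }
        return y;                                         // linear output unit
    }

    // One online training epoch; returns the mean squared error over the patterns.
    double trainEpoch(double[][] x, double[] target) {
        double sse = 0;
        double[] hid = new double[N_HID];
        for (int p = 0; p < x.length; p++) {
            double y = predict(x[p], hid);
            double err = target[p] - y;                   // real value minus network prediction
            sse += err * err;
            double[] deltaHid = new double[N_HID];        // hidden deltas using the current output weights
            for (int h = 0; h < N_HID; h++)
                deltaHid[h] = err * wOut[h] * hid[h] * (1.0 - hid[h]);
            for (int h = 0; h < N_HID; h++) {             // output-layer update with momentum
                double d = LR * err * hid[h] + MOMENTUM * dOutPrev[h];
                wOut[h] += d; dOutPrev[h] = d;
            }
            double db = LR * err + MOMENTUM * dOutPrev[N_HID];
            wOut[N_HID] += db; dOutPrev[N_HID] = db;
            for (int h = 0; h < N_HID; h++) {             // hidden-layer update with momentum
                for (int i = 0; i < N_IN; i++) {
                    double d = LR * deltaHid[h] * x[p][i] + MOMENTUM * dHidPrev[h][i];
                    wHid[h][i] += d; dHidPrev[h][i] = d;
                }
                double d = LR * deltaHid[h] + MOMENTUM * dHidPrev[h][N_IN];
                wHid[h][N_IN] += d; dHidPrev[h][N_IN] = d;
            }
        }
        return sse / x.length;
    }
}

A typical usage, under the data split described above, would be to scale the 4 inputs, split the 451 patterns into 401 training rows and 50 test rows, and call trainEpoch repeatedly for the chosen number of training epochs.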
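The error quantities discussed at the end of the abstract (per-pattern error as real value minus predicted value, and a total error percentage over all test patterns) can be summarized in a short Java routine. The abstract does not state the exact percentage formula, so the mean absolute percentage error used below is an assumption, and the sample values in main are purely illustrative.

// Hypothetical error summary for the 50 test patterns: per-pattern error = real - predicted,
// aggregated into a total error percentage (assumed here to be the mean absolute percentage error).
public final class ErrorSummary {
    public static double totalErrorPercent(double[] real, double[] predicted) {
        double sum = 0;
        for (int p = 0; p < real.length; p++) {
            double error = real[p] - predicted[p];        // per-pattern error
            sum += Math.abs(error / real[p]);             // relative error of this pattern
        }
        return 100.0 * sum / real.length;                 // percentage over all patterns; lower is better
    }

    public static void main(String[] args) {
        double[] real = {120.0, 98.5, 110.2};             // illustrative turbine-power values, not study data
        double[] pred = {118.7, 101.0, 109.5};
        System.out.printf("Total error: %.2f%%%n", totalErrorPercent(real, pred));
    }
}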