Reforming Architecture and Loss Function of Artificial Neural Networks in Binary Classification Problems

2020 
Artificial neural networks (ANNs), models inspired by the neurons of the biological human brain, have a long history as prime techniques in machine learning and computational intelligence, with a wide range of applications, and remain of great interest thanks to their success. Classification is one of the most popular research areas in ANNs, with a vast and growing literature. Our contribution is a reformed loss function for ANNs, paired with a novel architecture for the last layer of the neural net (NN), which allows the network to apply dynamic thresholding when deciding the label of a sample based on its probability-of-belonging values; this lets it model the complexities of the data more discriminatingly and attain better quantitative results. Although we establish our approach through mathematical argument specifically for binary classification, the concept and formulation are deliberately generalizable to multiclass classification problems.
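
The abstract does not reproduce the paper's loss function or the exact last-layer design, so the sketch below is only a minimal illustration of the general idea it describes: replacing a single fixed decision threshold with a per-sample decision driven by probability-of-belonging values from a two-output last layer. The function names, outputs, and numbers here are hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative sketch only; not the paper's actual formulation.

def fixed_threshold_decision(p_positive, threshold=0.5):
    """Conventional binary rule: one output probability compared
    to a fixed cutoff (typically 0.5)."""
    return (p_positive >= threshold).astype(int)

def dynamic_decision(p_belonging):
    """Hypothetical two-output rule: each sample carries a
    probability of belonging to each class, and the label follows
    the larger value, so the effective cutoff depends on both
    outputs for that sample rather than being fixed in advance."""
    return np.argmax(p_belonging, axis=1)

# Toy network outputs for three samples.
p_single = np.array([0.48, 0.60, 0.90])                        # single-output head
p_double = np.array([[0.40, 0.48], [0.55, 0.60], [0.05, 0.90]])  # two-output head

print(fixed_threshold_decision(p_single))  # [0 1 1]
print(dynamic_decision(p_double))          # [1 1 1]
```

In this toy example the first sample is rejected by the fixed 0.5 cutoff but accepted by the per-sample comparison, which is the kind of behavioral difference a dynamic-thresholding last layer is meant to capture.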