Affine Transformation Based Hierarchical Extreme Learning Machine

2020 
Recently, the single hidden layer feedforward network (SLFN) based extreme learning machine (ELM) has been extended to a hierarchical learning framework (HELM). Although HELM achieves better generalization performance with lower computational complexity than many deep neural networks (DNNs), it is found that as the number of layers increases, the input distribution of each layer may drift into the saturated regime of the non-linear activation function, which degrades generalization performance. However, little attention has been paid to normalizing the data distribution to address this issue. Thus, in this paper, an activation function layer with affine transformation (AT) inputs is introduced to normalize the data distribution, and a novel AT based HELM (AT-HELM) is developed. The proposed AT-HELM adapts the activation function inputs to the distribution of each layer and obtains better generalization performance. Experiments on 29 benchmark datasets demonstrate the superiority of AT-HELM.
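The saturation issue the abstract describes can be illustrated with a minimal sketch. The snippet below is not the paper's method; it assumes a basic random-feature ELM hidden layer and uses per-unit standardization as one hypothetical instance of an affine transformation of the activation inputs, then measures how many sigmoid outputs land near the saturated bounds 0 and 1 with and without it.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Random-feature hidden layer, as in a basic ELM: weights are not trained.
X = rng.normal(size=(200, 30))
W = rng.normal(scale=5.0, size=(30, 100))   # large weight scale drives saturation
b = rng.normal(scale=5.0, size=(1, 100))
Z = X @ W + b                               # raw pre-activations

# Hypothetical affine transformation of the activation inputs:
# rescale each hidden unit's pre-activation to zero mean / unit variance,
# pulling values back into the sigmoid's non-saturated regime.
Z_at = (Z - Z.mean(axis=0)) / (Z.std(axis=0) + 1e-8)

def saturated_fraction(h, eps=0.01):
    """Fraction of activations within eps of the sigmoid's bounds 0 or 1."""
    return float(np.mean((h < eps) | (h > 1.0 - eps)))

before = saturated_fraction(sigmoid(Z))
after = saturated_fraction(sigmoid(Z_at))
print(f"saturated before AT: {before:.2f}, after AT: {after:.2f}")
```

With large pre-activation variance, most sigmoid outputs sit near 0 or 1 (near-zero gradient information); after the affine rescaling, almost none do, which is the distributional effect the paper's AT layer targets.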