Random compact Gaussian kernel: Application to ELM classification and regression

2021 
Abstract Extreme learning machine (ELM) kernels based on random feature mapping have recently gained popularity because they require little human supervision. However, the superiority of the ELM kernel's mapping mechanism comes at a higher computation cost, making it hard for kernel learning algorithms to tackle large-scale learning tasks. On the other hand, the implicit mapping used in the conventional Gaussian kernel is computationally cheaper than the explicit computation of the ELM kernel, but it requires non-trivial human intervention for parameter selection. This paper proposes to merge both properties by defining a new kernel, the random compact Gaussian (RCG) kernel. The random feature mapping property enables the RCG kernel to save parameter selection time, while the implicit mapping property enables it to save kernel calculation time. The proposed kernel works by scaling the single kernel parameter of the conventional Gaussian kernel up to multiple kernel parameters and generating all of them randomly from a continuous probability distribution. We prove that the RCG kernel is a Mercer kernel. The kernel's random parameters are fixed before seeing the training samples; the kernel is then calculated implicitly and used to train ELMs. Experiments on 25 binary classification and regression benchmark problems show that the RCG kernel typically outperforms other competitive kernels. Compared to the ELM kernel, the RCG kernel not only achieves better generalization performance on most datasets but also incurs a much lower kernel calculation cost. In addition, a sensitivity analysis of the kernel parameters under k-fold cross-validation shows that the RCG kernel is robust and stable across repeated trials.
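The abstract does not give the kernel's closed form, so the following is only a minimal sketch of the idea it describes: replace the single width parameter gamma of the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2) with one randomly drawn parameter per input dimension, fix those parameters before training, and plug the resulting kernel matrix into the standard kernel-ELM closed form beta = (I/C + K)^{-1} T. The per-dimension parametrization, the uniform sampling distribution and its range, all function names, and the toy data are assumptions made for illustration, not the paper's specification.

```python
import numpy as np

def sample_rcg_params(dim, rng, low=1e-2, high=1.0):
    # One random kernel parameter per input dimension, drawn from a
    # continuous distribution (uniform is a placeholder choice here;
    # the paper's actual distribution is not given in the abstract).
    return rng.uniform(low, high, size=dim)

def rcg_kernel(X, Y, gammas):
    # Assumed form: k(x, y) = exp(-sum_d gamma_d * (x_d - y_d)^2).
    # With a single shared gamma this reduces to the conventional
    # Gaussian kernel, computed implicitly (no explicit feature map).
    diff = X[:, None, :] - Y[None, :, :]              # (n_x, n_y, d)
    sq = np.einsum('ijd,d->ij', diff**2, gammas)      # weighted sq. distances
    return np.exp(-sq)

def kernel_elm_fit(K, T, C=1.0):
    # Standard kernel-ELM solution: beta = (I/C + K)^{-1} T,
    # where K is the n x n training kernel matrix and C regularizes.
    n = K.shape[0]
    return np.linalg.solve(np.eye(n) / C + K, T)

# --- toy usage ---
rng = np.random.default_rng(seed=0)
X_train = rng.normal(size=(200, 8))
T_train = np.sin(X_train.sum(axis=1, keepdims=True))  # toy regression target
X_test = rng.normal(size=(50, 8))

gammas = sample_rcg_params(X_train.shape[1], rng)     # drawn once, before training
beta = kernel_elm_fit(rcg_kernel(X_train, X_train, gammas), T_train, C=10.0)
preds = rcg_kernel(X_test, X_train, gammas) @ beta    # reuse the same gammas
```

Because the parameters are sampled rather than tuned, no cross-validated grid search over gamma is needed, which is the claimed parameter-selection saving; the same fixed draw must be reused for the training and test kernel matrices.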