A kernelized maximal-figure-of-merit learning approach based on subspace distance minimization
2011
We propose a kernelized maximal-figure-of-merit (MFoM) learning approach that efficiently trains a nonlinear model using subspace distance minimization. In particular, a fixed, small number of training samples is chosen so that the distance between the function space constructed from the subset and the function space constructed from the entire training set is minimized. This construction of the subset enables us to learn a nonlinear model efficiently while keeping the resulting model nearly optimal compared to the model trained on the full training set. We show that the subspace distance can be minimized through the Nyström extension. Experimental results on various machine learning problems demonstrate clear advantages of the proposed technique over building the function space from randomly selected training samples. Additional comparisons with the model trained on the entire training set show that the proposed technique achieves comparable results while reducing training time tremendously.
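To make the landmark-selection idea concrete, the following is a minimal sketch in Python/NumPy. It assumes an RBF kernel and uses a greedy diagonal-pivot (pivoted-Cholesky-style) rule as a stand-in for the paper's subspace-distance criterion, with the Frobenius-norm error between the full kernel matrix and its Nyström approximation serving as a proxy for the distance between the full and subset-induced function spaces. All names, parameters, and the selection heuristic are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel between rows of X and Y (illustrative choice).
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystrom_approximation(K, landmarks):
    # Nystrom extension: K ~= C W^+ C^T, with C = K[:, landmarks]
    # and W the kernel sub-matrix over the selected landmarks.
    C = K[:, landmarks]
    W = K[np.ix_(landmarks, landmarks)]
    return C @ np.linalg.pinv(W) @ C.T

def greedy_landmarks(K, m):
    # Greedy diagonal-pivot selection: at each step pick the sample with the
    # largest residual diagonal, i.e. the one least well represented by the
    # subspace spanned by the landmarks chosen so far.
    n = K.shape[0]
    residual_diag = np.diag(K).copy()
    G = np.zeros((n, m))  # partial Cholesky factors of the selected subspace
    selected = []
    for t in range(m):
        i = int(np.argmax(residual_diag))
        selected.append(i)
        col = K[:, i] - G[:, :t] @ G[i, :t]
        G[:, t] = col / np.sqrt(max(residual_diag[i], 1e-12))
        residual_diag = np.maximum(residual_diag - G[:, t] ** 2, 0.0)
    return np.array(selected)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 10))          # synthetic data for illustration
    K = rbf_kernel(X, X, gamma=0.1)
    m = 25                                  # fixed, small number of landmarks
    for name, idx in [("random", rng.choice(len(X), m, replace=False)),
                      ("greedy", greedy_landmarks(K, m))]:
        err = np.linalg.norm(K - nystrom_approximation(K, idx), "fro")
        print(f"{name:6s} landmarks: ||K - K_nys||_F = {err:.3f}")
```

Running the script compares the approximation error of randomly chosen landmarks against the greedy selection, mirroring the paper's comparison between random subsets and subsets chosen to minimize the subspace distance.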