Semi-explicit mixture of experts based on information table
2021
Mixture of experts (ME), one of the most popular neural-network-based ensemble learning methods, consists of a number of experts and a gating network. ME follows a divide-and-conquer strategy: the gating network decomposes the input space into subspaces, and the experts are encouraged to specialize in those subspaces. In this paper, a hybrid ensemble system based on the ME method, named semi-explicit mixture of experts (SEME), is proposed; it consists of two steps. In the first step, a greedy algorithm that combines randomness with heuristic properties decomposes the input space into local subspaces and assigns a center to each subspace. In the second step, the ME algorithm with a distance-based gating network encourages the experts to specialize in the created subspaces while preserving interaction and cooperation between the experts. The proposed method is evaluated on 19 classification benchmark datasets. Based on the simulation results, the average improvement in classification accuracy across these datasets shows that SEME performs $$5.45\%$$, $$8.46\%$$, $$5.52\%$$, $$5.4\%$$ and $$3.95\%$$ better than Bagging, AdaBoost, random forests (RF), forests of local trees (FLT) and ME, respectively.