A directed batch growing approach to enhance the topology preservation of self-organizing map
2017
Highlights:
- A batch growing approach for GSOM is proposed.
- Neuron insertion rules are defined to steer the map growth in proper directions.
- DBGSOM performs much better than GSOM and SOM in terms of topology preservation.

Abstract:
The growing self-organizing map (GSOM) can generate feature maps and visualize high-dimensional data without its size being determined in advance. Most growing SOM algorithms use an incremental learning strategy. The conventional growing approach of GSOM fills all available positions around the candidate neuron, which can degrade the topology preservation of the map: unexpected network growth together with improper neuron addition and weight initialization can misconfigure and twist the map. To overcome this problem, this paper introduces a batch learning strategy for growing self-organizing maps, called DBGSOM, which directs the growing process based on the accumulated error around each candidate boundary neuron. In the proposed growing approach, only one new neuron is added around each candidate boundary neuron. DBGSOM offers mechanisms to find a proper growing position and to initialize the weight vector of each new neuron. The potential of DBGSOM was investigated on one synthetic dataset and six real-world benchmark datasets in terms of topology preservation and mapping quality. Experimental results show that the proposed growing strategy yields a map with better topology preservation and reduces the susceptibility to twisting compared with GSOM. Furthermore, the proposed method has better clustering ability than GSOM and SOM. Because DBGSOM generates fewer neurons, it needs less time than GSOM to learn the manifold of the data points.
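To make the directed insertion idea concrete, the sketch below shows one possible growth step in Python. It is not the paper's algorithm, only a minimal illustration under stated assumptions: a rectangular grid indexed by (row, col), growth triggered when a boundary neuron's accumulated error exceeds a threshold, the new position chosen opposite the existing neighbour carrying the largest error, and the new weight vector initialized by extrapolation (a rule borrowed from the original GSOM). The function name grow_one_neuron and the growth_threshold parameter are placeholders, not names from the paper.

```python
import numpy as np

def grow_one_neuron(weights, acc_error, growth_threshold):
    """Hypothetical directed growth step in the spirit of DBGSOM.

    weights:   dict mapping grid position (row, col) -> weight vector (np.ndarray)
    acc_error: dict mapping grid position (row, col) -> accumulated quantization error
    Inserts at most one new neuron and returns the updated dictionaries.
    """
    neighbours = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    # Boundary neurons are those with at least one free neighbouring grid position.
    def is_boundary(pos):
        return any((pos[0] + dr, pos[1] + dc) not in weights for dr, dc in neighbours)

    candidates = [p for p in weights
                  if is_boundary(p) and acc_error[p] > growth_threshold]
    if not candidates:
        return weights, acc_error

    # Candidate boundary neuron with the largest accumulated error.
    cand = max(candidates, key=lambda p: acc_error[p])

    existing = [(dr, dc) for dr, dc in neighbours
                if (cand[0] + dr, cand[1] + dc) in weights]
    if not existing:                       # isolated neuron: nothing to extrapolate from
        return weights, acc_error

    # Grow on the side opposite the existing neighbour with the largest accumulated
    # error, so the map extends along the error direction (an assumption here).
    dr, dc = max(existing, key=lambda d: acc_error[(cand[0] + d[0], cand[1] + d[1])])
    new_pos = (cand[0] - dr, cand[1] - dc)
    if new_pos in weights:                 # that side is occupied: take any free position
        new_pos = next((cand[0] + d[0], cand[1] + d[1]) for d in neighbours
                       if (cand[0] + d[0], cand[1] + d[1]) not in weights)

    # Initialize the new weight by extrapolating across the candidate neuron from the
    # neuron on the opposite side of the new position, if one exists.
    opposite = (2 * cand[0] - new_pos[0], 2 * cand[1] - new_pos[1])
    if opposite in weights:
        weights[new_pos] = 2 * weights[cand] - weights[opposite]
    else:
        weights[new_pos] = weights[cand].copy()
    acc_error[new_pos] = 0.0
    acc_error[cand] = 0.0                  # reset the error that triggered the growth
    return weights, acc_error
```

In a full batch-trained map, a step like this would presumably alternate with batch weight updates until no boundary neuron's accumulated error exceeds the growth threshold; the exact schedule and insertion rules used by DBGSOM are given in the paper itself.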