Handwritten Word Recognition by Multiple Classifiers: A Divide-and-Conquer Approach

2011 
In this paper, the combination of multiple classifiers based on the Mixture of Experts is investigated, with Multi-Layer Perceptrons (MLPs) used as both the gating and the expert networks. We call this method the Mixture of Multilayer Perceptron Experts (MOME). Extensive experiments on combining classifiers for Persian handwritten words are reported and discussed. To evaluate the proposed model, we use a real-world database, Iranshahr. The experimental results on this database support the claim that a mixture of several simple MLPs improves performance: the best result of the proposed model is 90.50%, a 60% reduction in error rate relative to single MLPs. Furthermore, comparison with other combination methods indicates that the proposed model yields an excellent recognition rate in handwritten word recognition.
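The core combination scheme named in the abstract, a gating network weighting the outputs of several expert classifiers, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy experts, the gating function, and all names here are assumptions standing in for trained MLPs.

```python
import math

def softmax(z):
    # Numerically stable softmax; turns gating logits into mixture weights.
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def mixture_of_experts(x, experts, gating):
    # experts: functions mapping a feature vector to per-class scores.
    # gating: function mapping the same feature vector to one logit per expert.
    g = softmax(gating(x))
    outputs = [expert(x) for expert in experts]
    n_classes = len(outputs[0])
    # Combined score: gate-weighted sum of the experts' class posteriors.
    return [sum(g[i] * outputs[i][c] for i in range(len(experts)))
            for c in range(n_classes)]

# Hypothetical stand-ins for trained MLP experts and a gating MLP.
expert_a = lambda x: [0.8, 0.2]   # leans toward class 0
expert_b = lambda x: [0.3, 0.7]   # leans toward class 1
gate = lambda x: [1.0, -1.0]      # favours expert_a for this input

scores = mixture_of_experts([0.5], [expert_a, expert_b], gate)
predicted = scores.index(max(scores))
```

Because the gating weights form a convex combination, the mixed scores remain a valid probability vector whenever each expert outputs one, which is what lets the mixture behave as a single, better-calibrated classifier.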