CLF networks with dynamic attention phase

1996 
We introduce the notion of dynamic nodal allocation of receptor neurons in CLF (conjunctions of localised features) networks. The attention phase is modified to create new receptor neurons for each class and input region only if no existing receptor neuron is activated by the current input. The generalization phase then uses backpropagation between the middle and output layers only to resolve interclass ambiguities. The power of the network is demonstrated on the problem of handwritten numeral recognition, and several experiments were conducted to test and improve the results. The learning algorithm appears robust enough for a larger training set including more classes of symbols and/or a wider range of writing styles.
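The dynamic attention phase described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the Gaussian localised-feature response, the activation threshold, and the receptive-field width are all assumptions introduced here for concreteness.

```python
import numpy as np

class DynamicCLFLayer:
    """Sketch of the dynamic attention phase: a new receptor neuron is
    allocated for the current class and input region only when no
    existing receptor neuron of that class fires on the current input.
    The radial activation and threshold below are illustrative
    assumptions, not taken from the paper."""

    def __init__(self, threshold=0.5, width=1.0):
        self.threshold = threshold   # assumed activation threshold
        self.width = width           # assumed receptive-field width
        self.centers = []            # receptor-neuron centres
        self.labels = []             # class label of each receptor

    def _activation(self, x, center):
        # Assumed Gaussian localised-feature response.
        return np.exp(-np.sum((x - center) ** 2) / (2 * self.width ** 2))

    def attend(self, x, label):
        """Present one training input; return True if a new receptor
        neuron was allocated, False if an existing one covered it."""
        for c, l in zip(self.centers, self.labels):
            if l == label and self._activation(x, c) >= self.threshold:
                return False         # existing receptor covers this input
        self.centers.append(np.asarray(x, dtype=float))
        self.labels.append(label)
        return True
```

Presenting the same input for the same class twice allocates only one receptor neuron, while a different class allocates its own; interclass ambiguities between overlapping receptors would then be left to the backpropagation step in the generalization phase.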