Prediction Tasks with Words and Sequences: Comparing a Novel Recurrent Architecture with the Elman Network

2011 
Classical connectionist models are not well suited to data that vary over time. In response, temporal connectionist models have emerged and now constitute a continuously growing research field. In this paper we present a novel supervised recurrent neural network architecture (SARASOM) based on the Associative Self-Organizing Map (A-SOM). The A-SOM is a variant of the Self-Organizing Map (SOM) that develops a representation of its input space while also learning to associate its activity with an arbitrary number of additional inputs. In this context the A-SOM learns to associate its activity with its own previous activity at a delay of one iteration. The performance of SARASOM was evaluated and compared with the Elman network in a number of prediction tasks using sequences of letters (including some experiments with a reduced lexicon of 10 words). The results are very encouraging, with SARASOM learning slightly better than the Elman network.
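The core mechanism described above, a self-organizing map that also learns to associate its current activity with its own activity from the previous iteration, can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, update rules (a Gaussian activity profile around the best-matching unit and a delta-rule association update), and all parameters are assumptions introduced for illustration only.

```python
import numpy as np

class RecurrentAssocSOM:
    """Hypothetical sketch of an A-SOM-style map with a one-iteration
    recurrent association (names and update rules are assumptions, not
    taken from the paper)."""

    def __init__(self, n_units, dim, lr=0.1, sigma=1.0, assoc_lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.random((n_units, dim))    # SOM codebook vectors
        self.a = np.zeros((n_units, n_units))  # association weights: prev activity -> units
        self.prev_act = np.zeros(n_units)      # activity at t-1 (one-iteration delay)
        self.lr, self.sigma, self.assoc_lr = lr, sigma, assoc_lr

    def activity(self, x):
        # Gaussian activity profile centred on the best-matching unit
        d = np.linalg.norm(self.w - x, axis=1)
        bmu = int(np.argmin(d))
        idx = np.arange(len(d))
        return np.exp(-((idx - bmu) ** 2) / (2 * self.sigma ** 2)), bmu

    def step(self, x):
        act, _ = self.activity(x)
        # associated activity predicted from the previous time step
        assoc_act = self.a @ self.prev_act
        # SOM update: pull codebook vectors toward x, weighted by activity
        self.w += self.lr * act[:, None] * (x - self.w)
        # delta-rule update: learn to predict current activity from previous activity
        self.a += self.assoc_lr * np.outer(act - assoc_act, self.prev_act)
        self.prev_act = act
        return act, assoc_act
```

In a letter-prediction setting of the kind the abstract mentions, each letter would be encoded as an input vector and fed in sequence; after training, the associated activity at time t acts as a prediction of the map's activity at the next step, i.e., of the next letter.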