Learning distant cause and effect using only local and immediate credit assignment

2021 
We present a recurrent neural network memory that uses sparse coding to create a combinatoric encoding of sequential inputs. The network is trained using only local and immediate credit assignment. Despite this constraint, results are comparable to networks trained using deep backpropagation or Backpropagation Through Time (BPTT). With several examples, we show that the network can associate distant cause and effect in a discrete stochastic process, predict partially observable higher-order sequences, and learn to generate many time-steps of video simulations. Typical memory consumption is 10-30x less than that of conventional RNNs, such as LSTM, trained by BPTT. One limitation of the memory is generalization to unseen input sequences. We explore this limitation further by measuring next-word prediction perplexity on the Penn Treebank dataset.
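To make the central idea concrete, the sketch below illustrates one way a recurrent memory with a sparse (top-k) hidden code can be trained using only local and immediate credit assignment: the previous state is detached at every step, so no gradient flows back through time (no BPTT). This is an illustrative assumption, not the paper's actual architecture; the class name, top-k sparsification, hyperparameters, and training loop are all hypothetical.

```python
# Minimal sketch (assumed, not the authors' implementation): a recurrent memory
# with a k-sparse hidden code, trained with only the immediate prediction error.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseRecurrentMemory(nn.Module):
    def __init__(self, input_size, hidden_size, k):
        super().__init__()
        self.k = k  # number of active units in the sparse code (assumed hyperparameter)
        self.encode = nn.Linear(input_size + hidden_size, hidden_size)
        self.decode = nn.Linear(hidden_size, input_size)

    def top_k_sparsify(self, x):
        # Keep only the k largest activations per sample; zero out the rest.
        _, indices = torch.topk(x, self.k, dim=-1)
        mask = torch.zeros_like(x).scatter_(-1, indices, 1.0)
        return x * mask

    def forward(self, x_t, h_prev):
        # Detach the previous state: credit assignment stays local and immediate,
        # so gradients never propagate back through earlier time-steps.
        h_prev = h_prev.detach()
        pre = self.encode(torch.cat([x_t, h_prev], dim=-1))
        h_t = self.top_k_sparsify(torch.relu(pre))
        x_pred = self.decode(h_t)  # predict the next input from the sparse code
        return x_pred, h_t

# Toy usage: next-step prediction with a purely local loss at each time-step.
model = SparseRecurrentMemory(input_size=32, hidden_size=256, k=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
sequence = torch.randn(100, 1, 32)  # (time, batch, features), random stand-in data
h = torch.zeros(1, 256)
for t in range(sequence.shape[0] - 1):
    x_pred, h = model(sequence[t], h)
    loss = F.mse_loss(x_pred, sequence[t + 1])  # immediate error only
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```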