Stability of Internal States in Recurrent Neural Networks Trained on Regular Languages

2021 
Abstract We provide an empirical study of the stability of recurrent neural networks trained to recognize regular languages. Noise is used to force the recurrent neurons into saturation. In the saturated regime, analysis of the network activations reveals the formation of clusters that resemble discrete states of a finite state machine. We demonstrate that the transitions between these activation clusters in response to input symbols are deterministic and stable. The networks display stationary behavior on arbitrarily long strings and, when random perturbations are applied, they can recover, with their evolution converging back to the original clusters. This observation reinforces the interpretation of the networks as finite automata, with neurons or groups of neurons coding specific and meaningful input patterns.
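
The following is a minimal sketch, not the authors' code, of the setup the abstract describes: a recurrent network trained to recognize a regular language, with noise injected before the nonlinearity so the units saturate, followed by clustering of the hidden activations to expose FSM-like discrete states. The example language (strings over {a, b} with an even number of 'a's), the network dimensions, the noise level, and the number of clusters are all illustrative assumptions.

import random
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def sample_strings(n_strings=1000, max_len=20):
    # Regular language over {a, b}: strings with an even number of 'a's
    # (symbol 0 = 'a', symbol 1 = 'b'); its minimal DFA has two states.
    data = []
    for _ in range(n_strings):
        seq = [random.randint(0, 1) for _ in range(random.randint(1, max_len))]
        label = 1 if seq.count(0) % 2 == 0 else 0
        data.append((seq, label))
    return data

class NoisyRNN(nn.Module):
    def __init__(self, n_symbols=2, emb_dim=4, hidden_dim=8, noise=0.5):
        super().__init__()
        self.emb = nn.Embedding(n_symbols, emb_dim)
        self.w_in = nn.Linear(emb_dim, hidden_dim)
        self.w_rec = nn.Linear(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, 2)
        self.noise = noise
        self.hidden_dim = hidden_dim

    def forward(self, seq, collect=None):
        h = torch.zeros(1, self.hidden_dim)
        for sym in seq:
            x = self.emb(torch.tensor([sym]))
            pre = self.w_in(x) + self.w_rec(h)
            if self.training and self.noise > 0:
                # Noise added before the tanh pushes the units toward +/-1,
                # i.e. into the saturated regime discussed in the abstract.
                pre = pre + self.noise * torch.randn_like(pre)
            h = torch.tanh(pre)
            if collect is not None:
                collect.append(h.detach().squeeze(0).numpy())
        return self.out(h)

def train(model, data, epochs=5, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        random.shuffle(data)
        for seq, label in data:
            opt.zero_grad()
            loss = loss_fn(model(seq), torch.tensor([label]))
            loss.backward()
            opt.step()

if __name__ == "__main__":
    data = sample_strings()
    model = NoisyRNN()
    train(model, data)

    # Collect hidden states on clean (noise-free) runs and cluster them;
    # tight, well-separated clusters play the role of automaton states.
    model.eval()
    states = []
    with torch.no_grad():
        for seq, _ in data[:200]:
            model(seq, collect=states)
    km = KMeans(n_clusters=2, n_init=10).fit(np.array(states))
    print("cluster sizes:", np.bincount(km.labels_))

With a saturating network of this kind, one would check that each (cluster, symbol) pair always maps to the same successor cluster, and that perturbing the hidden state returns the trajectory to the original clusters, which is the stability property the paper studies.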