Noise correlations for faster and more robust learning.

2020 
Distributed population codes are ubiquitous in the brain and pose a challenge to downstream neurons that must learn an appropriate readout. Here we explore the possibility that this learning problem is simplified through inductive biases implemented through stimulus-independent noise correlations that constrain learning to task-relevant dimensions. We test this idea with a neural network model of a perceptual discrimination task in which the correlation among similarly tuned units can be manipulated independently of overall population signal-to-noise ratio. Higher noise correlations among similarly tuned units led to faster and more robust learning, favoring homogeneous weights assigned to neurons within a functionally similar pool, and could emerge naturally with Hebbian learning. When multiple discriminations were learned simultaneously, noise correlations across relevant feature dimensions sped learning, whereas those across irrelevant feature dimensions slowed it. These results suggest that noise correlations may serve to constrain learning to appropriate dimensions and thereby optimize readout learning.
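The model setup described above can be sketched in a minimal form. The snippet below is an illustrative reconstruction, not the authors' code: it builds two opposing pools of similarly tuned units, injects within-pool noise with a tunable pairwise correlation `c` (via a shared-plus-private noise decomposition), and trains a simple delta-rule linear readout on the ±1 discrimination. All function names and parameter values here are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_noise(n_trials, n_units, c, sigma=1.0):
    """Stimulus-independent noise with pairwise correlation c within a pool:
    each unit's noise = sqrt(c) * shared component + sqrt(1 - c) * private."""
    shared = rng.standard_normal((n_trials, 1))
    private = rng.standard_normal((n_trials, n_units))
    return sigma * (np.sqrt(c) * shared + np.sqrt(1 - c) * private)

def make_trials(n_trials, n_per_pool=20, c=0.3, signal=0.5):
    """Two functionally opposing pools; the stimulus s = ±1 drives them
    with opposite sign, and each pool gets its own correlated noise."""
    s = rng.choice([-1.0, 1.0], size=n_trials)
    pool_a = signal * s[:, None] + correlated_noise(n_trials, n_per_pool, c)
    pool_b = -signal * s[:, None] + correlated_noise(n_trials, n_per_pool, c)
    return np.concatenate([pool_a, pool_b], axis=1), s

def train_readout(X, s, lr=0.01, epochs=5):
    """Delta-rule learning of a linear readout (a stand-in for the
    downstream neuron learning its weights from population activity)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, s):
            w += lr * (target - np.tanh(w @ x)) * x
    return w

X, s = make_trials(500)
w = train_readout(X, s)
accuracy = np.mean(np.sign(X @ w) == s)
```

In this decomposition, raising `c` concentrates noise along the pool-mean direction, so readout error gradients point more consistently along the task-relevant axis; this is one way the paper's manipulation of correlation independent of signal strength could be operationalized.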