Tightening the Biological Constraints on Gradient-Based Predictive Coding

2021 
Predictive coding (PC) is a general theory of cortical function. The local, gradient-based learning rules found in one class of PC models have been shown to closely approximate back-propagation under certain conditions [18, 26, 32]. This finding suggests that such PC models may help explain how the brain solves the credit assignment problem, and that they may be useful for developing local learning algorithms compatible with neuromorphic hardware. In this paper, we modify this PC model so that it better satisfies biological constraints, including the constraints that neurons can only have positive firing rates and that synapses transmit signals in only one direction. We also compute the gradient-based weight and activity updates given the modified activity values. We show that, under certain conditions, these modified PC networks perform as well, or nearly as well, on MNIST data as the unmodified PC model and as networks trained with back-propagation.