Aitchison, L., Jegminat, J., Menendez, J. A., Pfister, J.-P., Pouget, A., & Latham, P. E. (2021). Synaptic plasticity as Bayesian inference. Nature Neuroscience, 24, 565–571. https://doi.org/10.1038/s41593-021-00809-5
Text: prob_synapses.pdf (Accepted Version, 2MB)
Abstract
Learning, especially rapid learning, is critical for survival. However, learning is hard; a large number of synaptic weights must be set based on noisy, often ambiguous, sensory information. In such a high-noise regime, keeping track of probability distributions over weights is the optimal strategy. Here we hypothesize that synapses take that strategy; in essence, when they estimate weights, they include error bars. They then use that uncertainty to adjust their learning rates, with more uncertain weights having higher learning rates. We also make a second, independent, hypothesis: synapses communicate their uncertainty by linking it to variability in postsynaptic potential size, with more uncertainty leading to more variability. These two hypotheses cast synaptic plasticity as a problem of Bayesian inference, and thus provide a normative view of learning. They generalize known learning rules, offer an explanation for the large variability in the size of postsynaptic potentials and make falsifiable experimental predictions.
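To make the first hypothesis concrete, the sketch below is a minimal, hypothetical illustration (not the authors' model) of a Kalman-filter-style learning rule: each synapse keeps a Gaussian posterior over its weight (a mean plus an "error bar"), and its effective learning rate scales with its posterior variance, so more uncertain weights change faster. The sampled-PSP line only illustrates the second hypothesis, that uncertainty could show up as variability in postsynaptic potential size. All variable names, the diagonal (per-synapse) variance approximation, and the noise parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one neuron with n synapses, each weight tracked as a
# Gaussian posterior with mean mu[i] and variance sigma2[i] (the "error bar").
n = 5
true_w = rng.normal(0.0, 1.0, n)   # ground-truth weights generating the targets
mu = np.zeros(n)                    # posterior means (weight estimates)
sigma2 = np.ones(n)                 # posterior variances (uncertainty)
noise_var = 0.5                     # assumed observation-noise variance

for t in range(1000):
    x = rng.normal(0.0, 1.0, n)     # presynaptic activity on this trial

    # Hypothesis 2 (illustration only, not used in the update below):
    # the PSP is generated from weights sampled around the posterior mean,
    # so more uncertain weights produce more variable PSPs.
    sampled_w = mu + np.sqrt(sigma2) * rng.normal(0.0, 1.0, n)
    psp = x @ sampled_w

    target = x @ true_w + np.sqrt(noise_var) * rng.normal()
    error = target - x @ mu

    # Hypothesis 1: Kalman-style Bayesian update. The per-synapse gain
    # (effective learning rate) grows with that synapse's variance.
    gain = sigma2 * x / (noise_var + x @ (sigma2 * x))
    mu += gain * error
    # Variance shrinks as evidence accumulates (diagonal approximation:
    # off-diagonal posterior covariance terms are ignored for simplicity).
    sigma2 *= 1.0 - gain * x

print("estimated weights: ", np.round(mu, 2))
print("true weights:      ", np.round(true_w, 2))
print("remaining variance:", np.round(sigma2, 3))
```

Running this shows the posterior variances shrinking over trials while the means converge toward the generating weights, which is the qualitative behavior the abstract describes: uncertain synapses learn quickly early on, then their learning rates fall as their error bars tighten.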