Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback
- PMID: 28743932
- PMCID: PMC5527101
- DOI: 10.1038/s41467-017-00181-8
Abstract
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules.
Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently, without any need for task-specific operations.
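The abstract describes training generic neural networks with a simple error-based learning rule on probabilistic inference tasks such as cue combination. The following is a minimal sketch of that kind of setup, assuming Poisson spike counts from Gaussian tuning curves as inputs, a single ReLU hidden layer, and a squared-error (delta-rule) weight update; the population sizes, gain ranges, learning rate, and trial counts are illustrative assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Poisson population code for one cue (illustrative parameters) ---
n_units = 20                           # neurons per input population (assumption)
pref = np.linspace(-10, 10, n_units)   # preferred stimuli
sigma_tc = 3.0                         # tuning-curve width (assumption)

def population_response(s, gain):
    """Poisson spike counts from Gaussian tuning curves; gain sets reliability."""
    rates = gain * np.exp(-0.5 * ((s - pref) / sigma_tc) ** 2)
    return rng.poisson(rates).astype(float)

def make_trial():
    """One cue-combination trial: a stimulus encoded by two populations
    whose gains (reliabilities) vary independently from trial to trial."""
    s = rng.uniform(-5, 5)
    g1, g2 = rng.uniform(0.5, 3.0, size=2)
    x = np.concatenate([population_response(s, g1), population_response(s, g2)])
    return x, s

# --- Generic feedforward network trained with an error-based rule ---
n_in, n_hid = 2 * n_units, 50          # hidden-layer size is an assumption
W1 = rng.normal(0, 0.1, (n_hid, n_in))
b1 = np.zeros(n_hid)
w2 = rng.normal(0, 0.1, n_hid)
b2 = 0.0
lr = 1e-4                              # learning rate (assumption)

def forward(x):
    h = np.maximum(0.0, W1 @ x + b1)   # ReLU hidden layer
    return h, w2 @ h + b2              # scalar estimate of the stimulus

for trial in range(200_000):
    x, s = make_trial()
    h, s_hat = forward(x)
    err = s_hat - s                    # scalar error feedback, no explicit probabilities
    # Gradient descent on 0.5 * err**2
    w2 -= lr * err * h
    b2 -= lr * err
    grad_h = err * w2 * (h > 0)
    W1 -= lr * np.outer(grad_h, x)
    b1 -= lr * grad_h

# --- Evaluate: the trained network should track the stimulus across gain levels ---
test_err = [(forward(x)[1] - s) ** 2 for x, s in (make_trial() for _ in range(2000))]
print("test RMSE:", np.sqrt(np.mean(test_err)))
```

Because the gains, and hence the cue reliabilities, vary from trial to trial, an estimator that approaches optimal performance must implicitly weight the two populations by their reliability; comparing the trained network's error against the reliability-weighted (posterior mean) estimate is one way to probe how close it gets, in the spirit of the tasks summarized above.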
Conflict of interest statement
The authors declare no competing financial interests.