Analogue synaptic noise: implications and learning improvements
- PMID: 8049804
- DOI: 10.1142/s0129065793000353
Abstract
We analyse the effects of analogue noise on synaptic arithmetic during multilayer perceptron training by expanding the cost function to include noise-mediated penalty terms. These calculations predict that fault tolerance, generalisation ability, and learning trajectory should all be improved by such noise injection. Extensive simulation experiments on two distinct classification problems substantiate these claims. The results appear to hold generally for any training scheme in which weights are adjusted incrementally, and have wide-ranging implications for applications, particularly those involving "inaccurate" analogue neural VLSI.
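To make the scheme concrete, the sketch below trains a small multilayer perceptron while adding Gaussian noise to the synaptic weights at each presentation, computing the forward and backward passes with the noisy weights but applying the updates to the clean stored weights. The toy XOR task, network sizes, learning rate, and noise level are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR classification task (assumed example, not from the paper)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units (illustrative choice)
W1 = rng.normal(0.0, 1.0, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))
b2 = np.zeros(1)

lr = 0.5       # learning rate (assumption)
sigma = 0.05   # std of additive synaptic weight noise (assumption)

# Loss with the clean weights before training, for comparison
loss_init = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)

for step in range(5000):
    # Inject additive Gaussian noise into the weights for this step only
    W1n = W1 + rng.normal(0.0, sigma, W1.shape)
    W2n = W2 + rng.normal(0.0, sigma, W2.shape)

    # Forward pass through the noisy weights
    h = sigmoid(X @ W1n + b1)
    out = sigmoid(h @ W2n + b2)

    # Backpropagation of squared error, gradients taken w.r.t. noisy weights
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2n.T) * h * (1.0 - h)

    # Updates are applied incrementally to the clean stored weights
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

# Evaluate with the clean (noise-free) weights
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
loss_final = np.mean((pred - y) ** 2)
```

Because every step sees a differently perturbed weight vector, the network is discouraged from relying on any single precise weight value, which is the mechanism behind the fault-tolerance and generalisation effects described above.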
