Neural Computation

January 2007, Vol. 19, No. 1, Pages 194-217
(doi: 10.1162/neco.2007.19.1.194)
© 2006 Massachusetts Institute of Technology
Free-Lunch Learning: Modeling Spontaneous Recovery of Memory
After a language has been learned and then forgotten, relearning some words appears to facilitate spontaneous recovery of other words. More generally, relearning partially forgotten associations induces recovery of other associations in humans, an effect we call free-lunch learning (FLL). Using neural network models, we prove that FLL is a necessary consequence of storing associations as distributed representations. Specifically, we prove that (1) FLL becomes increasingly likely as the number of synapses (connection weights) increases, suggesting that FLL contributes to memory in neurophysiological systems, and (2) the magnitude of FLL is greatest if inactive synapses are removed, suggesting a computational role for synaptic pruning in physiological systems. We also demonstrate that FLL is different from generalization effects conventionally associated with neural network models. As FLL is a generic property of distributed representations, it may constitute an important factor in human memory.
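The core claim can be illustrated with a minimal simulation. The sketch below is an assumption-laden toy, not the paper's actual model: associations are stored as a weight vector of a linear network, "forgetting" is isotropic weight drift, and "relearning" a subset of associations applies the minimum-norm weight correction that restores that subset exactly. The point demonstrated is that, averaged over trials, the error on the *untouched* associations also drops after relearning the subset, i.e. free-lunch learning as a generic property of distributed storage.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 100          # number of synapses (weight dimension)
n_assoc = 20     # total stored associations
n_relearn = 10   # associations that get relearned
sigma = 0.5      # forgetting noise scale
trials = 200

before, after = [], []
for _ in range(trials):
    X = rng.standard_normal((n_assoc, d))          # input patterns (one per row)
    w_star = rng.standard_normal(d)                # weights storing all associations
    y = X @ w_star                                 # association targets
    w = w_star + sigma * rng.standard_normal(d)    # partial forgetting: weight drift

    S, T = X[:n_relearn], X[n_relearn:]            # relearned vs untouched patterns
    # minimum-norm weight change that makes the relearned subset exact again
    w_new = w + np.linalg.pinv(S) @ (y[:n_relearn] - S @ w)

    before.append(np.mean((T @ w - y[n_relearn:]) ** 2))
    after.append(np.mean((T @ w_new - y[n_relearn:]) ** 2))

# Mean squared error on the associations that were never relearned:
print(f"before relearning: {np.mean(before):.2f}")
print(f"after relearning:  {np.mean(after):.2f}")   # lower on average: the free lunch
```

Because the correction is a projection of the noise onto the subspace spanned by the relearned patterns, the residual error on any other association shrinks in expectation, which is consistent with the paper's observation that FLL becomes more likely as the number of synapses grows.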