Abstract

Most materials are changed by their history and show memory of things past. However, it is not clear when a system can continually learn new memories in sequence, without interfering with or entirely overwriting earlier memories. Here, we study the learning of multiple stable states in sequence by an elastic material that undergoes plastic changes as it is held in different configurations. We show that an elastic network with linear or nearly linear springs cannot learn continually without overwriting earlier states for a broad class of plasticity rules. On the other hand, networks of sufficiently nonlinear springs can learn continually, without erasing older states, using even simple plasticity rules. We trace this ability to cusped energy contours caused by strong nonlinearities and thus show that elastic nonlinearities play the role of Bayesian priors used in sparse statistical regression. Our model shows how specific material properties allow continual learning of new functions through deployment of the material itself.
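The link between cusped energy landscapes and sparse priors can be seen in a one-dimensional toy picture. The sketch below is an illustrative assumption, not the paper's actual network model: two "memories" a and b are encoded as preferred positions of a single degree of freedom x, with a quadratic energy standing in for linear springs and a sublinear power for strongly nonlinear ones.

```python
import numpy as np

# Toy 1D illustration (hypothetical, not the paper's model): two
# memories a and b encoded as preferred positions of a coordinate x.
a, b = 0.0, 1.0
x = np.linspace(-0.5, 1.5, 2001)

def local_minima(E):
    """Indices where E is lower than both grid neighbors."""
    return np.where((E[1:-1] < E[:-2]) & (E[1:-1] < E[2:]))[0] + 1

# Linear springs: quadratic energy. The two memories interfere and
# collapse into a single averaged minimum -- the second memory
# overwrites the first.
E_quad = (x - a) ** 2 + (x - b) ** 2
print("quadratic minima at x =", x[local_minima(E_quad)])  # ~0.5 only

# Strongly nonlinear springs: a sublinear energy has a cusp at each
# rest position, pinning a distinct stable state at every memory,
# much as the cusp of an L1 prior pins coefficients at zero in
# sparse regression.
E_cusp = np.abs(x - a) ** 0.5 + np.abs(x - b) ** 0.5
print("cusped minima at x =", x[local_minima(E_cusp)])  # ~0.0 and ~1.0
```

Under these assumptions, the quadratic landscape retains only one minimum at the midpoint, while the cusped landscape keeps separate minima at both a and b, mirroring the abstract's claim that sufficiently nonlinear springs allow new states to be learned without erasing older ones.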
