In 1972 Phil Anderson articulated the motto of condensed matter physics as “More is different.” However, for most condensed matter systems, many more is quite similar to more: this is why computer simulations of relatively small systems give insight into far larger systems. There are, however, systems in which many more is different. For example, the capabilities of artificial neural networks grow with their size. Unfortunately, so do the time and energy required to train them. By contrast, brains learn and perform an enormous variety of tasks on their own, using relatively little energy. Brains accomplish this without an external computer because their analog constituent parts (neurons) update their connections using local rules, without knowing what all the other neurons are doing. We have developed an approach to learning that shares this property—analog constituent parts update their properties via a local rule—but does not otherwise emulate the brain. Instead, we exploit physics to learn in a far simpler way. Our collaborators have implemented this approach in the lab, developing physical systems that learn and perform machine learning tasks on their own with little energy cost. These systems should open up the opportunity to study how many more is different within a new paradigm for scalable learning.
Title: Physical systems that learn by themselves
Host: Sid Goyal