Contrastive learning in tunable dynamical systems
Menachem Stern, Adam G. Frim, Raúl Candás, Andrea J. Liu, Vijay Balasubramanian
Abstract
We generalize the theory of supervised contrastive learning, previously applied to physical systems at equilibrium or in steady state, to systems whose dynamics are described by coupled ordinary differential equations. We show that if the physical dynamics break time-reversal symmetry, gradient descent on a cost function embodying the desired behavior cannot be achieved by a scalable process, even in principle. We therefore introduce Probably Approximately Right (PAR) learning processes, which combine a local contrastive learning rule with a scalable supervision protocol. We show that approximate, local supervision with forward propagation of the error signal successfully trains several tunable models of physical dynamics inspired by examples from biology and machine learning.
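As a toy illustration of the kind of local contrastive rule the abstract refers to (not the specific models or PAR protocol of the paper), the following sketch trains a small relaxational network by comparing a free phase with a weakly clamped phase. All details here are illustrative assumptions: overdamped dynamics dx/dt = -dE/dx with a quadratic coupling energy, a four-node network with one input, one output, and one grounded node, and an equilibrium-propagation-style weight update.

```python
import numpy as np

n = 4                      # nodes: 0 = input, 2 = output, 3 = ground (illustrative)
w = np.full((n, n), 0.5)   # symmetric couplings ("spring constants"), assumed positive
np.fill_diagonal(w, 0.0)

free_mask = np.array([0.0, 1.0, 1.0, 0.0])  # nodes 0 and 3 are held fixed
x_in, target, beta, eta = 1.0, 0.3, 0.2, 0.2

def relax(w, x, nudge=0.0):
    """Integrate overdamped dynamics dx/dt = -dE/dx, with
    E = (1/2) sum_{i<j} w_ij (x_i - x_j)^2, plus an optional weak
    nudge (1/2) * nudge * (x_out - target)^2 on the output node."""
    x = x.copy()
    for _ in range(3000):
        grad = (w * (x[:, None] - x[None, :])).sum(axis=1)
        grad[2] += nudge * (x[2] - target)
        x -= 0.05 * grad * free_mask   # only free nodes move
    return x

x0 = np.array([x_in, 0.0, 0.0, 0.0])
for epoch in range(300):
    xf = relax(w, x0)                  # free phase
    xc = relax(w, xf, nudge=beta)      # weakly clamped phase, seeded at free state
    df = (xf[:, None] - xf[None, :]) ** 2
    dc = (xc[:, None] - xc[None, :]) ** 2
    # local contrastive rule: each coupling sees only its own two nodes
    w -= (eta / beta) * 0.5 * (dc - df)
    w = np.clip(w, 0.01, None)         # keep couplings physical (positive)
    np.fill_diagonal(w, 0.0)

out = relax(w, x0)[2]
print(out)   # free-state output, driven toward the target value
```

In the limit of small nudge strength beta, the contrastive update approximates gradient descent on the output cost, while each coupling's update depends only on the states of the two nodes it connects, which is what makes the rule local.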
