Abstract
A distinguishing feature of the neural network models used in physics and chemistry is that they must obey basic underlying symmetries, such as symmetry under translations, rotations, and the exchange of identical particles. Over the last several years, the neural networks community has developed a class of architectures, called group-equivariant neural nets, that can efficiently "bake" such symmetries into the structure of the network itself. Equivariant neural nets leverage ideas from group representation theory and express all variables in the generalized Fourier space corresponding to the underlying group. In this article, we review this formalism and derive the general form of the operations allowed in equivariant neural networks. In particular, we discuss why the Clebsch–Gordan transform appears in such architectures and how it can play the role of an equivariant nonlinearity.
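As a concrete illustration of the notions the abstract refers to (the notation below is a standard convention, assumed here for definiteness, not taken from the original text): a map $f$ between spaces carrying group actions $T_g$ and $T'_g$ is equivariant if
\[
f(T_g\, x) = T'_g\, f(x) \qquad \text{for all } g \in G .
\]
For $G = \mathrm{SO}(3)$, the tensor product of two irreducible representations $D^{\ell_1}$ and $D^{\ell_2}$ decomposes via the Clebsch–Gordan transform as
\[
D^{\ell_1} \otimes D^{\ell_2} \;\cong\; \bigoplus_{\ell = |\ell_1 - \ell_2|}^{\ell_1 + \ell_2} D^{\ell} ,
\]
which is why forming tensor products of Fourier-space activations and projecting back onto irreducible components yields a nonlinearity that is itself equivariant.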