For the first time, physicists in the US have confirmed a decades-old prediction regarding the breaking of time-reversal symmetry in gauge fields. Marin Soljacic at the Massachusetts Institute of Technology and an international team of researchers have made the first demonstration of the “non-Abelian Aharonov-Bohm effect” in two optics experiments. With improvements, their techniques could find use in optoelectronics and fault-tolerant quantum computers.
The concept of a gauge theory first emerged in Maxwell’s famous equations of classical electrodynamics, which describe the physics of electric and magnetic fields. Gauge theories have since become an important part of physicists’ descriptions of the dynamics of elementary particles – notably the theory of quantum electrodynamics.
A salient feature of a gauge theory is that the physics it describes does not change when certain transformations are made to the underlying equations of the system. An example is adding a constant to the scalar potential, or a “curl-free” term to the vector potential, in Maxwell’s theory. Mathematically, such a change does not alter the electric and magnetic fields that act on a charged particle such as an electron – and therefore does not alter the electron’s behaviour – so Maxwell’s theory is gauge invariant.
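As a minimal sketch in standard textbook notation (the general, time-dependent form below is an illustration of this gauge freedom, not something taken from the new work), the electric and magnetic fields are built from the potentials, and the transformation leaves both fields, and hence the force on the electron, unchanged:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% The fields are built from the scalar potential phi and vector potential A:
\[
\mathbf{B} = \nabla \times \mathbf{A}, \qquad
\mathbf{E} = -\nabla \phi - \frac{\partial \mathbf{A}}{\partial t}
\]
% A gauge transformation, for an arbitrary smooth function chi(r, t):
\[
\mathbf{A} \to \mathbf{A} + \nabla \chi, \qquad
\phi \to \phi - \frac{\partial \chi}{\partial t}
\]
% B is unchanged because curl(grad chi) = 0, and the two extra terms in E,
% -grad(d chi/dt) and +d(grad chi)/dt, cancel exactly. The constant scalar
% potential mentioned above is the special case chi = C t for a constant C.
\end{document}
```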