“Think what we can do if we teach a quantum computer to do statistical mechanics,” posed Michael McGuigan, a computational scientist with the Computational Science Initiative at the U.S. Department of Energy’s Brookhaven National Laboratory.
At the time, McGuigan was reflecting on Ludwig Boltzmann and how the renowned physicist had to vigorously defend his theories of statistical mechanics. Boltzmann, who proffered his ideas about how atomic properties determine the physical properties of matter in the late 19th century, faced one enormous hurdle: atoms had not even been proven to exist at the time. The fatigue and discouragement of his peers' refusal to accept his views on atoms and physics haunted Boltzmann for the rest of his life.
Today, the Boltzmann factor, which gives the probability of finding a system of particles in a specific energy state relative to a zero-energy reference state, is used throughout physics. For example, the Boltzmann factor underlies calculations performed on the world’s largest supercomputers to study the behavior of atoms, molecules, and the quark “soup” discovered using facilities such as the Relativistic Heavy Ion Collider located at Brookhaven Lab and the Large Hadron Collider at CERN.
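To make the idea concrete, here is a minimal sketch (not drawn from the article) of the Boltzmann factor e^(−E / k_B T), using a hypothetical two-level system with a 0.1 eV energy gap at room temperature; the function name and example values are illustrative assumptions.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_factor(energy_ev: float, temperature_k: float) -> float:
    """Relative statistical weight exp(-E / (k_B T)) of a state with
    energy E, measured relative to a zero-energy reference state."""
    return math.exp(-energy_ev / (K_B * temperature_k))

# Illustrative example: a two-level system with a 0.1 eV gap at 300 K.
# The ratio of excited-state to ground-state occupation equals the
# Boltzmann factor of the gap energy.
ratio = boltzmann_factor(0.1, 300.0) / boltzmann_factor(0.0, 300.0)
print(f"excited/ground population ratio at 300 K: {ratio:.3e}")
```

At room temperature a 0.1 eV gap already suppresses the excited-state population to roughly 2% of the ground state, which is why small energy differences dominate the behavior of matter at everyday temperatures.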