
While Einstein’s theory of general relativity can explain a large array of fascinating astrophysical and cosmological phenomena, some properties of the universe at the largest scales remain a mystery. A new study using loop quantum cosmology—a theory that uses quantum mechanics to extend gravitational physics beyond Einstein’s theory of general relativity—accounts for two major mysteries. While the differences between the theories occur at the tiniest of scales—much smaller than even a proton—they have consequences at the largest accessible scales in the universe. The study, which appears online July 29 in the journal Physical Review Letters, also provides new predictions about the universe that future satellite missions could test.

While a zoomed-out picture of the universe looks fairly uniform, it does have large-scale structure: galaxies and dark matter, for example, are not uniformly distributed throughout the universe. The origin of this structure has been traced back to the tiny inhomogeneities observed in the Cosmic Microwave Background (CMB)—radiation that was emitted when the universe was 380,000 years young and that we can still see today. But the CMB itself has three puzzling features that are considered anomalies because they are difficult to explain using known physics.

“While seeing one of these anomalies may not be that statistically remarkable, seeing two or more together suggests we live in an exceptional universe,” said Donghui Jeong, associate professor of astronomy and astrophysics at Penn State and an author of the paper. “A recent study in the journal Nature Astronomy proposed an explanation for one of these anomalies that raised so many additional concerns, they flagged a ‘possible crisis in cosmology.’ Using loop quantum cosmology, however, we have resolved two of these anomalies naturally, avoiding that potential crisis.”

Skoltech scientists have shown that quantum-enhanced machine learning can be used on quantum (as opposed to classical) data, overcoming a significant slowdown common to these applications and opening a “fertile ground to develop computational insights into quantum systems.” The paper was published in the journal Physical Review A.

Quantum computers utilize quantum mechanical effects to store and manipulate information. While quantum effects are often claimed to be counterintuitive, they are what will enable quantum-enhanced calculations to dramatically outperform the best supercomputers. In 2019, the world saw a prototype of this potential when Google demonstrated what it billed as quantum computational supremacy.

Quantum algorithms have been developed to enhance a range of different computational tasks; more recently this has grown to include quantum-enhanced machine learning. Quantum machine learning was partly pioneered by Skoltech’s Laboratory for Quantum Information Processing, led by Jacob Biamonte, a coauthor of this paper. “Machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is not surprising that quantum computers might outperform classical computers on machine learning tasks,” he says.

Here’s the story – our protagonist rewinds history, locates baby Hitler, and averts global war by putting him on a path to peace … but, oh noes! This sets off a domino chain of events that stops our hero from being born, or worse, kicks off the apocalypse.

Unintended ‘butterfly effect’-style consequences of time travel might be a juicy problem in science fiction, but physicists now have reason to believe that in a quantum landscape, tweaking history in this way shouldn’t be a major problem.

Since going back to a previous moment in time is still in the ‘too hard’ basket, a pair of physicists from the Los Alamos National Laboratory in the US went with the next best thing and created a simulation using an IBM-Q quantum computer.
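As a rough sketch of the idea (a toy illustration, not the Los Alamos team’s actual circuit), the ‘rewind, damage, return’ experiment can be mimicked with a few simulated qubits. The snippet below uses Qiskit’s quantum_info module; the qubit count, the Haar-random unitary standing in for chaotic evolution, and the choice of a single-qubit measurement as the ‘damage’ are all assumptions made for illustration.

```python
# Toy "quantum butterfly effect" test, loosely inspired by the
# rewind-damage-return idea: scramble, meddle with the past,
# unscramble, and check how much local information survives.
from qiskit.quantum_info import (Statevector, partial_trace,
                                 random_unitary, state_fidelity)

n = 8  # qubits in the toy "world" (illustrative choice)

# Qubit 0 holds the present-day information (state |+>); the rest of
# the world starts out in |0...0>.
psi = Statevector.from_label("0" * (n - 1) + "+")

# A Haar-random unitary stands in for strongly scrambling dynamics.
U = random_unitary(2 ** n, seed=1234)

past = psi.evolve(U.adjoint())           # "travel back in time"
_, damaged_past = past.measure([n - 1])  # an intruder collapses one qubit
present = damaged_past.evolve(U)         # return to the present

# Compare qubit 0 before and after the damaged round trip.
others = list(range(1, n))
rho_ideal = partial_trace(psi, others)
rho_after = partial_trace(present, others)
print("fidelity of the information qubit:",
      state_fidelity(rho_ideal, rho_after))
```

If the scrambling is strong, the printed fidelity should typically sit well above 0.5, the value a completely randomized qubit would give; that dilution of local damage is the sense in which the simulated past resists butterfly-effect catastrophes.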

At the heart of every white dwarf star—the dense stellar object that remains after a star has burned away its fuel reserve of gases as it nears the end of its life cycle—lies a quantum conundrum: as white dwarfs add mass, they shrink in size, until they become so small and tightly compacted that they cannot sustain themselves, collapsing into a neutron star.

This puzzling relationship between a white dwarf’s mass and size, called the mass-radius relation, was first theorized by Nobel Prize-winning astrophysicist Subrahmanyan Chandrasekhar in the 1930s. Now, a team of Johns Hopkins astrophysicists has developed a method to observe the phenomenon itself, using data collected by the Sloan Digital Sky Survey and a recent dataset released by the Gaia Space Observatory. The combined datasets provided more than 3,000 white dwarfs for the team to study.
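For readers who want the numbers, the textbook form of the relation (a standard result quoted for context, not a finding of the Hopkins study) can be written as:

```latex
% Equilibrium under non-relativistic electron degeneracy pressure:
% adding mass makes the star smaller, not larger.
R \propto M^{-1/3}

% In the relativistic limit the pressure can no longer keep pace, and
% no stable white dwarf exists above the Chandrasekhar mass:
M_{\mathrm{Ch}} \approx 1.4\, M_{\odot}
```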

A report of their findings, led by Hopkins senior Vedant Chandra, is now in press at The Astrophysical Journal and available online on arXiv.

A way of shrinking the devices used in quantum sensing systems has been developed by researchers at the UK Quantum Technology Hub Sensors and Timing, which is led by the University of Birmingham.

Sensing devices have a huge number of industrial uses, from carrying out ground surveys to monitoring volcanoes. Scientists working to improve the capabilities of these sensors are now turning to quantum technologies to boost their sensitivity.

Machines developed in laboratories using quantum technology, however, are cumbersome and difficult to transport, making current designs unsuitable for most industrial uses.


Ultimately, the MIT engineers hope that their giant atoms will lead to a simpler, enhanced form of quantum computer.

“This allows us to experimentally probe a novel regime of physics that is difficult to access with natural atoms,” MIT engineer Bharath Kannan said in a press release. “The effects of the giant atom are extremely clean and easy to observe and understand.”

Scientists have found that a physical property called ‘quantum negativity’ can be used to take more precise measurements of everything from molecular distances to gravitational waves.

The researchers, from the University of Cambridge, Harvard and MIT, have shown that quantum particles can carry an unlimited amount of information about things they have interacted with. The results, reported in the journal Nature Communications, could enable far more precise measurements and power new technologies, such as super-precise microscopes and quantum computers.

Metrology is the science of estimation and measurement. If you weighed yourself this morning, you’ve done metrology. In the same way that quantum computing is expected to revolutionize the way complicated calculations are done, quantum metrology, using the strange behavior of subatomic particles, may revolutionize the way we measure things.
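To make that concrete, the standard textbook comparison (general quantum-metrology scaling, not a result specific to the Cambridge-led paper) is between the precision of a phase estimate made with N independent probes and one made with N entangled probes:

```latex
% Standard quantum limit: N independent probes
\Delta\phi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}

% Heisenberg limit: N entangled probes
\Delta\phi_{\mathrm{HL}} \sim \frac{1}{N}
```

Loosely speaking, ‘quantum negativity’ is one of the resources that lets a measurement scheme extract more information per probe than any independent-probe strategy allows.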

Quantum computers have enormous potential for calculations using novel algorithms and involving amounts of data far beyond the capacity of today’s supercomputers. While such computers have been built, they are still in their infancy and have limited applicability for solving complex problems in materials science and chemistry. For example, they only permit the simulation of the properties of a few atoms for materials research.

Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory and the University of Chicago (UChicago) have developed a method paving the way to using quantum computers to simulate realistic molecules and complex materials, whose description requires hundreds of atoms.

The research team is led by Giulia Galli, director of the Midwest Integrated Center for Computational Materials (MICCoM), a group leader in Argonne’s Materials Science division and a member of the Center for Molecular Engineering at Argonne. Galli is also the Liew Family Professor of Electronic Structure and Simulations in the Pritzker School of Molecular Engineering and a Professor of Chemistry at UChicago. She worked on this project with assistant scientist Marco Govoni and graduate student He Ma, both part of Argonne’s Materials Science division and UChicago.