By Andrew Zimmerman Jones, Daniel Robbins

According to string theory, all particles in the universe can be divided into two types: bosons and fermions. String theory predicts that a type of connection, called supersymmetry, exists between these two particle types.

Under supersymmetry, a fermion must exist for every boson and a boson for every fermion. Unfortunately, experiments have not yet detected these extra particles.
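To make the pairing rule concrete, the short snippet below lists a few known particles alongside the hypothetical superpartners that supersymmetry would assign to them. The partner names are the conventional ones used in the supersymmetry literature, and, as noted above, none of these partners has been detected.

```python
# Illustrative pairing only: conventional superpartner names from the
# supersymmetry literature.  Each partner's spin differs by half a unit,
# and none of these partner particles has been observed experimentally.
superpartners = {
    "electron (fermion, spin 1/2)": "selectron (boson, spin 0)",
    "quark (fermion, spin 1/2)":    "squark (boson, spin 0)",
    "photon (boson, spin 1)":       "photino (fermion, spin 1/2)",
    "gluon (boson, spin 1)":        "gluino (fermion, spin 1/2)",
}

for particle, partner in superpartners.items():
    print(f"{particle:30s} <->  {partner}")
```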

20th century physics has seen two major paradigm shifts in the way we understand Mother Nature. One is quantum mechanics, and the other is relativity. The marriage between the two, called quantum field theory, conceived an enfant terrible, namely anti-matter. As a result, the number of elementary particles doubled. We believe that 21st century physics is aimed at yet another level of marriage, this time between quantum mechanics and general relativity, Einstein’s theory of gravity. The couple has not been getting along very well, resulting in mathematical inconsistencies, meaningless infinities, and negative probabilities. The key to success may be in supersymmetry, which doubles the number of particles once more.

Why was anti-matter needed? One reason was to solve a crisis in 19th-century classical electromagnetism. An electron is, to the best of our knowledge, a point particle. Namely, it has no size, yet it carries an electric charge. A charged particle inevitably produces an electric potential around it, and it also feels the potential created by itself. This leads to an infinite “self-energy” of the electron. In other words, it takes substantial energy to “pack” all the charge of an electron into a small size.

On the other hand, Einstein’s famous equation says that the mass of a particle determines the energy of the particle at rest. For an electron, its rest energy is known to be 0.511 MeV. With only this much energy available, it cannot afford to “pack” itself into a size smaller than that of a nucleus. The classical theory of electromagnetism is not consistent below this distance. However, it is known that the electron is at least ten thousand times smaller than that.
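A rough back-of-the-envelope sketch of this argument equates the electrostatic self-energy of a charge confined to a radius r with the electron’s rest energy; the crossover scale, the so-called classical electron radius, comes out at roughly the size of a nucleus.

```python
# Back-of-the-envelope sketch: the "classical electron radius", below which the
# electrostatic self-energy alone would exceed the electron's rest energy.
# Obtained by setting e^2 / (4*pi*eps0*r) = m_e * c^2 and solving for r.
import math

e    = 1.602176634e-19       # elementary charge, C
eps0 = 8.8541878128e-12      # vacuum permittivity, F/m
m_e  = 9.1093837015e-31      # electron mass, kg
c    = 2.99792458e8          # speed of light, m/s

rest_energy_MeV = m_e * c**2 / e / 1e6
r_classical = e**2 / (4 * math.pi * eps0 * m_e * c**2)

print(f"rest energy      ~ {rest_energy_MeV:.3f} MeV")      # ~0.511 MeV
print(f"classical radius ~ {r_classical * 1e15:.2f} fm")    # ~2.82 fm, roughly nuclear size
# As noted above, experiments bound the electron's actual size to at least
# ten thousand times smaller than this classical estimate.
```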

M-theory is a theory in physics that unifies all consistent versions of superstring theory. The existence of such a theory was first conjectured by Edward Witten at a string theory conference at the University of Southern California in the Spring of 1995. Witten’s announcement initiated a flurry of research activity known as the second superstring revolution.

Prior to Witten’s announcement, string theorists had identified five versions of superstring theory. Although these theories appeared, at first, to be very different, work by several physicists showed that the theories were related in intricate and nontrivial ways. In particular, physicists found that apparently distinct theories could be unified by mathematical transformations called S-duality and T-duality. Witten’s conjecture was based in part on the existence of these dualities and in part on the relationship of the string theories to a field theory called eleven-dimensional supergravity.
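As a concrete flavour of what such a duality does, the sketch below uses the standard textbook T-duality relation for a closed string on a circle (in string units, with oscillator contributions omitted): swapping the momentum and winding numbers while inverting the radius leaves the mass spectrum untouched.

```python
# Sketch of T-duality for a closed string on a circle (string units, alpha' = 1,
# oscillator contributions omitted): the compact-momentum / winding part of the
# mass spectrum,  M^2 = (n / R)^2 + (w * R)^2,  is unchanged when the momentum
# number n and winding number w are swapped and the radius is inverted, R -> 1/R.
def mass_squared(n, w, R):
    return (n / R) ** 2 + (w * R) ** 2

R = 2.7
for n, w in [(1, 0), (0, 1), (3, 2), (5, -4)]:
    original = mass_squared(n, w, R)
    dual     = mass_squared(w, n, 1.0 / R)   # swap n <-> w, invert R
    print(f"n={n:2d}, w={w:2d}:  M^2 = {original:.4f}   dual M^2 = {dual:.4f}")
```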

Although a complete formulation of M-theory is not known, the theory should describe two- and five-dimensional objects called branes and should be approximated by eleven-dimensional supergravity at low energies. Modern attempts to formulate M-theory are typically based on matrix theory or the AdS/CFT correspondence.

Physicists have discovered a novel quantum state of matter whose symmetry can be manipulated at will by an external magnetic field. The methods demonstrated in a series of experiments could be useful for exploring materials for next-generation nano- or quantum technologies.

A new unified theory for heat transport accurately describes a wide range of materials – from crystals and polycrystalline solids to alloys and glasses – and allows them to be treated in the same way for the first time. The methodology, which is based on the Green-Kubo theory of linear response and concepts from lattice dynamics, naturally accounts for quantum mechanical effects and thus allows for the predictive modelling of heat transport in glasses at low temperature – a feat never achieved before, say the researchers who developed it. It will be important for better understanding and designing heat transporting devices in a host of applications, from heat management in high-power electronics, batteries and photovoltaics to thermoelectric energy harvesting and solid-state cooling. It might even help describe heat flow in planetary systems.
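At the heart of that approach is the Green-Kubo relation, which expresses the thermal conductivity as a time integral of the equilibrium heat-flux autocorrelation function. The sketch below shows the basic classical recipe, with a synthetic, exponentially correlated flux standing in for the output of a real molecular-dynamics run; the cell volume, temperature and flux magnitudes are arbitrary placeholders.

```python
import numpy as np

# Sketch of the classical Green-Kubo estimator for the thermal conductivity,
#     kappa = V / (3 * kB * T^2) * integral_0^inf <J(0) . J(t)> dt,
# where J is the heat-flux density.  A synthetic, exponentially correlated
# flux stands in for the output of a molecular-dynamics run.
kB = 1.380649e-23          # Boltzmann constant, J/K
T  = 300.0                 # temperature, K
V  = 1.0e-26               # simulation-cell volume, m^3 (placeholder)
dt = 1.0e-15               # time between flux samples, s

rng = np.random.default_rng(0)
n_steps, tau = 100_000, 2.0e-13          # toy correlation time of the flux
J = np.zeros((n_steps, 3))
for i in range(1, n_steps):              # Ornstein-Uhlenbeck toy heat flux
    J[i] = J[i - 1] * (1 - dt / tau) + rng.normal(size=3) * 1.0e9 * np.sqrt(dt)

def heat_flux_acf(J, max_lag):
    """Average <J(0).J(t)> over all available time origins."""
    n = len(J)
    return np.array([np.sum(J[: n - lag] * J[lag:], axis=1).mean()
                     for lag in range(max_lag)])

acf = heat_flux_acf(J, max_lag=1500)
kappa = V / (3 * kB * T**2) * np.trapz(acf, dx=dt)
print(f"Green-Kubo estimate: kappa ~ {kappa:.3e} W/(m K)   (toy data)")
```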

“Heat transport is the fundamental mechanism through which thermal equilibrium is reached,” explains Stefano Baroni of the Scuola Internazionale Superiore di Studi Avanzati (SISSA) in Trieste, Italy, who led this research effort. “It can also be thought of as the most fundamental manifestation of irreversibility in nature – as heat flows from warm areas in the same system to cooler ones, just as time flows from the past to the future (the ‘arrow of time’). What is more, many modern technologies rely on our ability to control heat transport.”

However, despite its importance, heat transport is still poorly understood, and this lack of understanding makes it difficult to simulate heat transport in materials. To overcome this knowledge gap, researchers employ various simulation techniques based on diverse physical assumptions and approximations for different classes of material – crystals on the one hand and disordered solids and liquids on the other.

A handful of spins in diamond have shone new light on one of the most enduring mysteries in physics – how the objective reality of classical physics emerges from the murky, probabilistic quantum world. Physicists in Germany and the US have used nitrogen-vacancy (NV) centres in diamond to demonstrate “quantum Darwinism”, whereby the “fittest” states of a system survive and proliferate in the transition between the quantum and classical worlds.

In the past, physicists tended to view the classical and quantum worlds as being divided by an abrupt barrier that makes a fundamental distinction between the familiar macroscopic (classical) and the unfamiliar microscopic (quantum) realms. But in recent decades that view has changed. Many experts now think that the transition is gradual, and that the definite classical states we measure come from probabilistic quantum states progressively (although very quickly) losing their coherence as they become ever more entangled with their environment.

Quantum Darwinism, put forward by Wojciech Zurek of Los Alamos National Laboratory in New Mexico, argues that the classical states we perceive are robust quantum states that can survive entanglement during decoherence. His theoretical framework posits that the information about these states will be duplicated many times and disseminated throughout the environment. Just as natural selection tells us that the fittest individuals in a species must survive to reproduce in great numbers and so go on to shape evolution, the fittest quantum states will be copied and appear classical. This redundancy means that many individual observers will measure any given state as having the same value, so ensuring objective reality.
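A minimal toy model of this redundancy (a sketch of the general idea, not of the NV-centre experiment itself) is a single system qubit whose pointer state gets copied onto several environment qubits; every single-qubit fragment of the environment then carries the same one bit of classical information about the system.

```python
import numpy as np

# Toy model of quantum Darwinism: a system qubit in (|0> + |1>)/sqrt(2) is
# "measured" by N environment qubits via CNOT-like copying, producing the
# GHZ-type state (|0 00..0> + |1 11..1>)/sqrt(2).  The quantum mutual
# information between the system and ANY single environment qubit then equals
# the full classical entropy of the system (1 bit): the pointer information
# is stored redundantly throughout the environment.
N = 6                                   # number of environment qubits
n_qubits = N + 1
dim = 2 ** n_qubits
psi = np.zeros(dim)
psi[0] = psi[dim - 1] = 1 / np.sqrt(2)  # |00...0> and |11...1> components

def reduced_density(psi, keep, n_qubits):
    """Trace out all qubits not listed in `keep` (qubit 0 is the system)."""
    t = psi.reshape([2] * n_qubits)
    traced = [q for q in range(n_qubits) if q not in keep]
    rho = np.tensordot(t, t.conj(), axes=(traced, traced))
    d = 2 ** len(keep)
    return rho.reshape(d, d)

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

S_sys = entropy(reduced_density(psi, [0], n_qubits))
for env_qubit in range(1, n_qubits):
    S_frag  = entropy(reduced_density(psi, [env_qubit], n_qubits))
    S_joint = entropy(reduced_density(psi, [0, env_qubit], n_qubits))
    mutual  = S_sys + S_frag - S_joint
    print(f"I(system : env qubit {env_qubit}) = {mutual:.3f} bits")  # ~1.0 each
```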

Time has a fundamentally different character in quantum mechanics than in general relativity. In quantum theory, events unfold in a fixed order, while in general relativity temporal order is affected by the distribution of matter.

Now, an international team of scientists has discovered a new kind of quantum time order. The study set out to answer a deceptively simple question: what happens when an object massive enough to influence the flow of time is placed in a quantum state?

The discovery emerged from an experiment the group designed to bring together elements of the two major physics theories developed over the past century.

A team of researchers has just demonstrated quantum enhancement in an actual X-ray machine, achieving the desirable goal of eliminating background noise for precision detection.

The relationships between photon pairs on quantum scales can be exploited to create sharper, higher-resolution images than classical optics allows. This emerging field is called quantum imaging, and it has some impressive potential, particularly since, even using optical light, it can reveal objects that usually can’t be seen, such as bones and organs.

Quantum correlation describes a number of different relationships between photon pairs. Entanglement is one of these, and is applied in optical quantum imaging.
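A toy illustration of how pair correlations suppress background (a simplified stand-in for the actual X-ray scheme, with made-up rates) is coincidence counting: only events that arrive at two detectors within a narrow time window are kept, which discards almost all uncorrelated background hits while retaining most of the genuine pairs.

```python
import numpy as np

# Toy illustration of correlation-based background rejection: photon pairs hit
# detectors A and B at (nearly) the same time, while background hits arrive at
# random, uncorrelated times.  Keeping only coincidences within a narrow window
# removes most of the background while retaining most of the signal.
rng = np.random.default_rng(1)
T_total = 1.0        # s of simulated acquisition
window  = 5e-9       # s coincidence window
n_pairs = 20_000     # correlated signal pairs
n_bkg   = 200_000    # uncorrelated background hits per detector

pair_times = rng.uniform(0, T_total, n_pairs)
jitter     = rng.normal(0, 1e-9, n_pairs)            # detector timing jitter
hits_A = np.sort(np.concatenate([pair_times, rng.uniform(0, T_total, n_bkg)]))
hits_B = np.sort(np.concatenate([pair_times + jitter, rng.uniform(0, T_total, n_bkg)]))

# Count hits on A that have at least one hit on B within the coincidence window.
idx   = np.searchsorted(hits_B, hits_A)
left  = np.abs(hits_B[np.clip(idx - 1, 0, len(hits_B) - 1)] - hits_A)
right = np.abs(hits_B[np.clip(idx, 0, len(hits_B) - 1)] - hits_A)
coincidences = np.count_nonzero(np.minimum(left, right) < window)

print(f"raw hits on A:      {len(hits_A)}")
print(f"coincidence counts: {coincidences}")   # ~signal pairs plus a few accidentals
```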

The quantum internet promises absolutely tap-proof communication and powerful distributed sensor networks for new science and technology. However, because quantum information cannot be copied, it is not possible to send it over a classical network. Quantum information must be transmitted by quantum particles, and special interfaces are required for this. The Innsbruck-based experimental physicist Ben Lanyon, who was awarded the Austrian START Prize in 2015 for his research, is investigating these important interfaces for a future quantum internet.

Now his team at the Department of Experimental Physics at the University of Innsbruck and at the Institute of Quantum Optics and Quantum Information of the Austrian Academy of Sciences has achieved a record for the transfer of quantum entanglement between matter and light. For the first time, a distance of 50 kilometers was covered using fiber optic cables. “This is two orders of magnitude further than was previously possible and is a practical distance to start building inter-city quantum networks,” says Ben Lanyon.
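To get a feel for why 50 kilometres is a milestone, the quick estimate below assumes a typical telecom-fibre attenuation of about 0.2 dB/km, an illustrative figure rather than the measured loss of the Innsbruck link.

```python
# Rough estimate of photon survival in optical fibre, assuming a typical
# telecom-band attenuation of ~0.2 dB/km (an illustrative figure, not the
# measured loss of the Innsbruck experiment).
attenuation_db_per_km = 0.2

for distance_km in (1, 10, 50, 100):
    loss_db = attenuation_db_per_km * distance_km
    transmission = 10 ** (-loss_db / 10)
    print(f"{distance_km:4d} km: {loss_db:5.1f} dB loss, "
          f"{transmission * 100:6.2f} % of photons survive")
# At 50 km that is ~10 dB, i.e. roughly one photon in ten arrives, which is
# why low-loss telecom wavelengths matter for long-distance quantum links.
```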

This week, a collaborative effort among computer scientists and academics to safeguard data is winning attention, and it has quantum computing written all over it.

The Netherlands’ Centrum Wiskunde & Informatica (CWI), the national research institute for mathematics and computer science, had the story: IBM Research has developed “quantum-safe algorithms” for securing data, working with international partners including CWI and Radboud University in the Netherlands.

IBM and partners share concerns that data protected by current encryption methods may become insecure within the next 10 to 30 years.