Circa 2020


Lasing—the emission of a collimated beam of light with a well-defined wavelength (color) and phase—results from a self-organization process in which a collection of emission centers synchronizes itself to produce identical light particles (photons). A similar self-organized synchronization phenomenon can also lead to the generation of coherent vibrations—a phonon laser, where phonon denotes, in analogy to photons, the quantum particles of sound.

Photon lasing was first demonstrated approximately 60 years ago, some four decades after its prediction by Albert Einstein. This stimulated emission of amplified light has since found an unprecedented number of scientific and technological applications in multiple areas.

Although the concept of a “laser of sound” was proposed at almost the same time, only a few implementations have so far been reported, and none has attained technological maturity. Now, a collaboration between researchers from the Instituto Balseiro and Centro Atómico in Bariloche (Argentina) and the Paul-Drude-Institut in Berlin has introduced a novel approach for the efficient generation of coherent vibrations in the tens-of-GHz range using semiconductor structures. Interestingly, this approach to the generation of coherent phonons is based on another of Einstein’s predictions: the fifth state of matter, a Bose-Einstein condensate (BEC) of coupled light-matter particles (polaritons).

Place one clock at the top of a mountain. Place another on the beach. Eventually, you’ll see that each clock tells a different time. Why?


In his book “The Order of Time,” Italian theoretical physicist Carlo Rovelli suggests that our perception of time — our sense that time is forever flowing forward — could be a highly subjective projection. After all, when you look at reality on the smallest scale (using equations of quantum gravity, at least), time vanishes.

“If I observe the microscopic state of things,” writes Rovelli, “then the difference between past and future vanishes … in the elementary grammar of things, there is no distinction between ‘cause’ and ‘effect.’”

So, why do we perceive time as flowing forward? Rovelli notes that, although time disappears on extremely small scales, we still obviously perceive events occurring sequentially. In other words, we observe entropy: order changing into disorder; an egg cracking and getting scrambled.
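The standard textbook form of that idea (a generic statement, not a quotation from Rovelli) is Boltzmann's entropy formula together with the second law:

```latex
% Boltzmann entropy: S grows with the number of microstates W that are
% compatible with what we observe macroscopically (k_B is Boltzmann's constant).
S = k_B \ln W
% Second law for an isolated system: entropy does not decrease. This
% one-way statement is the asymmetry we read as time flowing forward.
\frac{dS}{dt} \geq 0
```

A cracked, scrambled egg corresponds to vastly more microstates W than an intact one, which is why we only ever see the process run in one direction.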

In high-end 21st century communications, information travels in the form of a stream of light pulses typically traveling through fiber optic cables. Each pulse can be as faint as a single photon, the smallest possible unit (quantum) of light. The speed at which such systems can operate depends critically on how fast and how accurately detectors on the receiving end can discriminate and process those photons.
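As a rough illustration of why detector speed caps the link rate, the short Python sketch below treats the detector's recovery ("dead") time as a hard ceiling on how many photons per second can be registered. The dead times and efficiency are hypothetical placeholders, not NIST's measured figures.

```python
# Back-of-the-envelope ceiling on a single-photon link, assuming each bit is
# carried by at most one photon and the detector must recover (dead time)
# between clicks. All numbers are illustrative placeholders.

def max_click_rate(dead_time_s: float) -> float:
    """Upper bound on detection events per second for one detector."""
    return 1.0 / dead_time_s

def usable_bit_rate(dead_time_s: float, efficiency: float) -> float:
    """Crude usable rate: the click ceiling scaled by detection efficiency."""
    return max_click_rate(dead_time_s) * efficiency

# Cutting the recovery time by 10x lifts the rate ceiling by 10x.
for dead_time_s in (50e-9, 5e-9):  # 50 ns vs. 5 ns recovery (hypothetical)
    rate = usable_bit_rate(dead_time_s, efficiency=0.9)
    print(f"dead time {dead_time_s * 1e9:.0f} ns -> ~{rate / 1e6:.0f} Mbit/s ceiling")
```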

Now scientists at the National Institute of Standards and Technology (NIST) have devised a method that can detect individual photons at a rate 10 times faster than the best existing technology, with lower error rates, higher detection efficiency, and less noise.

“While classical communication and detection can operate at blazing speeds, quantum communications, which need that ultimate sensitivity for those faintest of pulses, are limited to much lower speeds,” said group leader Alan Migdall. “Combining that ultimate sensitivity with the ability to count photons at high speeds has been a long-standing challenge. Here we are pushing both performance limits all in the same device.”

Circa 2014. Essentially, this could make endless computer chips from light.


Princeton researchers have managed to cause light to behave like a crystal within a specialized computer chip, according to a recent paper. This is the first time the effect has been achieved in a lab.

Here’s why it’s so hard: atoms can easily form solids, liquids, and gases because, when they come into contact, they push and pull on each other. That push and pull forms the underlying structure of all matter. Light particles, or photons, do not typically interact with one another, according to Dr. Andrew Houck, a professor of electrical engineering at Princeton and an author on the study. The trick of this research was forcing them to do just that.

“We build essentially an artificial atom, using lots of atoms acting in concert,” Houck tells Popular Science. “What emerges is a quantum mechanical object that [at about half a millimeter] is visible on the classical scale.”

Superconductivity – the ability of a material to transmit an electric current without loss – is a quantum effect that, despite years of research, is still limited to very low temperatures. Now a team of scientists at the MPSD has succeeded in creating a metastable state with vanishing electrical resistance in a molecular solid by exposing it to finely tuned pulses of intense laser light. This effect had already been demonstrated in 2016, but only for a very short time; in the new study the authors show a far longer-lived state, with a lifetime nearly 10,000 times longer than before. Such long lifetimes for light-induced superconductivity hold promise for applications in integrated electronics. The research by Budden et al. has been published in Nature Physics.
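For a sense of scale (the few-picosecond baseline below is an assumption for illustration, not the paper's exact figure):

```python
# A state that lived a few picoseconds in 2016, extended by a factor of
# roughly 10,000, lands in the tens-of-nanoseconds regime.
baseline_lifetime_s = 5e-12                        # assumed ~5 ps (illustrative)
improved_lifetime_s = baseline_lifetime_s * 1e4    # "nearly 10,000 times longer"
print(f"~{improved_lifetime_s * 1e9:.0f} ns")      # -> ~50 ns
```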

Superconductivity is one of the most fascinating and mysterious phenomena of modern physics. It describes the sudden loss of electrical resistance in certain materials when they are cooled below a critical temperature. However, the need for such cooling still limits the technological usability of these materials.

In recent years, research by Andrea Cavalleri’s group at the MPSD has revealed that intense pulses of infrared light are a viable tool to induce superconducting properties in a variety of different materials at much higher temperatures than would be possible without photo-stimulation. However, these exotic states have so far persisted for only a few picoseconds (trillionths of a second), thus limiting the experimental methods for studying them to ultrafast optics.

Diamond-Based Quantum Accelerator Puts Qubits in a Server Rack.

The startup Quantum Brilliance recently announced that it has developed a market-ready, diamond-based quantum accelerator.


Its makers envision this device growing to 50+ qubits and fitting aboard satellites and autonomous vehicles.

Although universal fault-tolerant quantum computers – with millions of physical quantum bits (or qubits) – may be a decade or two away, quantum computing research continues apace. It has been hypothesized that quantum computers will one day revolutionize information processing across a host of military and civilian applications, from pharmaceutical discovery to advanced batteries, machine learning, and cryptography. A key missing element in the race toward fault-tolerant quantum systems, however, is a meaningful set of metrics to quantify how useful or transformative large quantum computers will actually be once they exist.

To provide standards against which to measure quantum computing progress and to drive current research toward specific goals, DARPA announced its Quantum Benchmarking program. Its aim is to re-invent key quantum computing metrics, make those metrics testable, and estimate the quantum and classical resources required to reach critical performance thresholds.

“It’s really about developing quantum computing yardsticks that can accurately measure what’s important to focus on in the race toward large, fault-tolerant quantum computers,” said Joe Altepeter, program manager in DARPA’s Defense Sciences Office. “Building a useful quantum computer is really hard, and it’s important to make sure we’re using the right metrics to guide our progress towards that goal. If building a useful quantum computer is like building the first rocket to the moon, we want to make sure we’re not quantifying progress toward that goal by measuring how high our planes can fly.”
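For a concrete sense of what such a yardstick can look like, the Python sketch below gives a toy estimate of one existing benchmark, quantum volume, from a single two-qubit gate error rate. It is a deliberately crude model for illustration only; it ignores readout and routing errors and is not one of the metrics DARPA is developing.

```python
# Toy quantum-volume (QV) estimate: find the largest n for which a width-n,
# depth-n random circuit still "passes", approximated here as the whole
# circuit succeeding with probability above 2/3 given a two-qubit gate
# error rate. Real benchmarking is far more involved; this is a sketch.

def toy_quantum_volume(two_qubit_error: float, max_width: int = 32) -> int:
    best_n = 0
    for n in range(2, max_width + 1):
        gates = n * (n // 2)          # n layers, ~n/2 two-qubit gates per layer
        if (1.0 - two_qubit_error) ** gates > 2.0 / 3.0:
            best_n = n
    return 2 ** best_n                # QV is conventionally reported as 2^n

for eps in (1e-2, 5e-3, 2e-3):        # hypothetical two-qubit error rates
    print(f"two-qubit error {eps:.0e} -> toy QV {toy_quantum_volume(eps)}")
```

Even in this toy model, a modest improvement in gate error raises the achievable score by orders of magnitude, which is why gate fidelity rather than raw qubit count tends to dominate such metrics.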

Fueled by the need for faster life sciences and healthcare research, especially in the wake of the deadly COVID-19 pandemic, IBM and the 100-year-old Cleveland Clinic are partnering to bolster the Clinic’s research capabilities by integrating a wide range of IBM’s advanced technologies in quantum computing, AI and the cloud.

Access to IBM’s quantum systems has so far been primarily cloud-based, but IBM is providing the Cleveland Clinic with IBM’s first private-sector, on-premises quantum computer in the U.S. Scheduled for delivery next year, the initial IBM Quantum System One will harness between 50 and 100 qubits, according to IBM, but the goal is to stand up a more powerful, next-generation 1,000+ qubit quantum system at the Clinic as the project matures.

For the Cleveland Clinic, the 10-year partnership with IBM will add huge research capabilities and computing power as part of an all-new Discovery Accelerator being created at the Clinic’s campus in Cleveland, Ohio. The Accelerator will serve as the technology foundation for the Clinic’s new Global Center for Pathogen Research & Human Health, which is being developed to drive research in areas including genomics, single-cell transcriptomics, population health, clinical applications, and chemical and drug discovery, according to the Clinic.

About a year ago, Honeywell announced that it had entered the quantum computing race with a technology that was different from anything else on the market. The company claimed that because the performance of its qubits was so superior to that of its competitors’ qubits, its computer could do better on a key quantum computing benchmark than quantum computers with far more qubits.

Now, roughly a year later, the company finally released a paper describing the feat in detail. But in the meantime, the competitive landscape has shifted considerably.