Researchers have fashioned ultrathin silicon nanoantennas that trap and redirect light, for applications in quantum computing, LIDAR and even the detection of viruses.

Light is notoriously fast. Its speed is crucial for rapid information exchange, but as light zips through materials, its chances of interacting with and exciting atoms and molecules can become very small. If scientists could put the brakes on light particles, or photons, it would open the door to a host of new technology applications.

Now, in a paper published on August 17, 2020, in Nature Nanotechnology, Stanford scientists demonstrate a new approach to slow light significantly, much like an echo chamber holds onto sound, and to direct it at will. Researchers in the lab of Jennifer Dionne, associate professor of materials science and engineering at Stanford, structured ultrathin silicon chips into nanoscale bars to resonantly trap light and then release or redirect it later. These “high-quality-factor” or “high-Q” resonators could lead to novel ways of manipulating and using light, including new applications for quantum computing, virtual reality and augmented reality; light-based WiFi; and even the detection of viruses like SARS-CoV-2.
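As a rough back-of-the-envelope illustration (not taken from the paper), the quality factor Q of a resonator sets how long light is stored: the photon lifetime is approximately τ ≈ Q/ω, where ω = 2πc/λ is the angular frequency of the trapped light. The Q value and wavelength below are illustrative assumptions, not the paper's reported figures.

```python
import math

def photon_lifetime(q_factor: float, wavelength_nm: float) -> float:
    """Approximate photon storage time (seconds) of an optical resonator.

    Uses tau = Q / omega, with omega = 2 * pi * c / wavelength.
    """
    c = 2.998e8  # speed of light in vacuum, m/s
    omega = 2 * math.pi * c / (wavelength_nm * 1e-9)  # angular frequency, rad/s
    return q_factor / omega

# Illustrative high-Q resonator at a telecom wavelength of 1550 nm
tau = photon_lifetime(2500, 1550)  # on the order of picoseconds
```

Even a Q of a few thousand holds light for only picoseconds, but that is still thousands of optical cycles, which is what boosts light-matter interaction.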

“We’re essentially trying to trap light in a tiny box that still allows the light to come and go from many different directions,” said postdoctoral fellow Mark Lawrence, who is also lead author of the paper. “It’s easy to trap light in a box with many sides, but not so easy if the sides are transparent—as is the case with many silicon-based applications.”

“It’s not just about the smell,” said Adrian Cheok, one of the scientists behind the experiments. “It is part of a whole, integrated virtual reality or augmented reality. So, for example, you could have a virtual dinner with your friend through the internet. You can see them in 3D and also share a glass of wine together.”

In real life, odors are transmitted when airborne molecules waft into the nose, prompting specialized nerve cells in the upper airway to fire off impulses to the brain. In the recent experiments, performed on 31 test subjects at the Imagineering Institute in the Malaysian city of Nusajaya, researchers used electrodes in the nostrils to deliver weak electrical currents above and behind the nostrils, where these neurons are found.

The researchers were able to evoke 10 different virtual odors, including fruity, woody and minty.

Apple’s AR glasses are reportedly called Apple Glass, according to a leaker, and the product is set to be unveiled during the iPhone 12 launch event. However, the coronavirus health crisis might force Apple to postpone the reveal to the first quarter of next year.

People, bicycles, and cars, or road, sky, and grass: which pixels of an image represent distinct foreground persons or objects in front of a self-driving car, and which pixels represent background classes?

This task, known as panoptic segmentation, is a fundamental problem that has applications in numerous fields such as self-driving cars, robotics, augmented reality and even in biomedical image analysis.
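Conceptually, panoptic segmentation fuses two per-pixel maps: a semantic map (which class is this pixel?) and an instance map (which individual object?). A common encoding, used for example in the COCO panoptic format, packs both into a single id per pixel, with "stuff" classes like road or sky carrying no instance id. The class numbering below is a hypothetical example, not any model's actual label set.

```python
import numpy as np

# Hypothetical class ids: 0 = road ("stuff"), 1 = car, 2 = person ("things")
STUFF_CLASSES = {0}

def to_panoptic(semantic: np.ndarray, instance: np.ndarray) -> np.ndarray:
    """Encode class and instance labels into one panoptic id per pixel.

    Follows the common class_id * 1000 + instance_id convention;
    "stuff" classes keep a bare class id with no instance component.
    """
    panoptic = semantic.astype(np.int64) * 1000
    thing_mask = ~np.isin(semantic, list(STUFF_CLASSES))
    panoptic[thing_mask] += instance[thing_mask]
    return panoptic

# A tiny 2x3 image: road on the left, one car, and one person
semantic = np.array([[0, 1, 1],
                     [0, 1, 2]])
instance = np.array([[0, 1, 1],
                     [0, 2, 1]])
pan = to_panoptic(semantic, instance)
# pan distinguishes car #1 (1001) from car #2 (1002) and person #1 (2001),
# while all road pixels share the single id 0
```

The point of the combined id is that every pixel gets exactly one label that answers both questions at once, which is what makes the output "panoptic" rather than purely semantic or purely instance-level.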

At the Department of Computer Science at the University of Freiburg, Dr. Abhinav Valada, assistant professor for Robot Learning and member of BrainLinks-BrainTools, focuses on this research question. Valada and his team have developed the state-of-the-art “EfficientPS” artificial intelligence (AI) model, which enables coherent recognition of visual scenes more quickly and effectively.

Augmented reality has been the next big thing for a while, but we haven’t seen many practical applications. Here’s a tool that looks useful, though: using AR and AI to copy and paste objects from the real world to your computer using just your phone.

A key component of NASA’s X-59 Quiet SuperSonic Technology (QueSST) aircraft is undergoing vibration tests at the space agency’s Langley Research Center in Hampton, Virginia. The eXternal Vision System (XVS) is a special camera system that the pilot of the X-plane will use to see forward while the experimental supersonic craft is in flight.

When the X-59 takes to the skies in 2021, the pilot will be faced with a problem not often encountered since the Concorde fleet of supersonic passenger jetliners was retired. The X-59 is meant to test new technologies to build a new generation of supersonic commercial aircraft and, while it promises to overcome some of the drawbacks of Concorde, it will still share some of its difficulties.

One is that the ideal design of a long-range supersonic liner is essentially that of a needle-nosed dart. The annoying thing is that, though this shape may be fine from an aerodynamic point of view, it makes it extremely difficult for the pilot to see forward without a lot of complex mechanics, like Concorde’s droop nose and special sliding windscreen.