What if neither distance nor language mattered? What if technology could help you be anywhere you need to be and speak any language? With AI and holographic technology, this is now possible, and it is revolutionary.
Microsoft has created a hologram that can transform a person into a digital speaker of another language. The software giant unveiled the technology during a keynote at the Microsoft Inspire partner conference this morning in Las Vegas. Microsoft recently scanned Julia White, an executive for its Azure business, at a Mixed Reality capture studio to create an exact hologram replica of her.
The digital version appeared onstage to translate the keynote into Japanese. Microsoft made this possible using its Azure AI technologies and neural text-to-speech. The system takes recordings of White’s voice to create a personalized voice signature, making it sound as though she is speaking Japanese.
Microsoft has shown off holograms of people before, but the translation aspect is a step beyond what has been possible with HoloLens. This looks like it’s just a demonstration for now, and you’d need access to a Mixed Reality capture studio to even start to take advantage of this. Microsoft’s studios are equipped with lighting rigs and high-resolution cameras to capture a fully accurate digital hologram of someone, which isn’t something that can be done easily at home with a smartphone just yet.
Quantum interference between electrons scattered by light has now been used to produce holograms of the underlying electromagnetic fields, a result that could open up methods for studying materials at high temporal and spatial resolution: a fresh approach to imaging light fields.
Photography measures how much light of different colors hits the photographic film. But light is also a wave, and is therefore characterized by its phase. Phase specifies the position of a point within the wave cycle and corresponds to depth information, meaning that recording the phase of light scattered by an object makes it possible to retrieve its full 3D shape, which a simple photograph cannot capture. This is the basis of optical holography, popularized by the fancy holograms in sci-fi movies like Star Wars.
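The idea that an intensity-only recording can still encode phase can be sketched numerically: interfering an object wave with a reference wave produces fringes whose positions shift when the object's phase changes, even though the object's own intensity stays constant. This is a minimal 1D sketch, not the demonstration described in the article; all parameters are illustrative.

```python
import numpy as np

# 1D strip of "film", 10 micrometres wide, illuminated by green light.
x = np.linspace(0, 10e-6, 1000)      # film coordinate, metres
wavelength = 0.5e-6                  # ~0.5 um, green light
k = 2 * np.pi / wavelength

# Reference beam arriving at a small angle to the film.
reference = np.exp(1j * k * x * np.sin(0.1))

def recorded_intensity(object_phase):
    """Film records only intensity, |object + reference|^2, yet the
    fringe positions depend on the object's phase."""
    obj = np.exp(1j * object_phase)  # unit-amplitude object wave
    return np.abs(obj + reference) ** 2

i0 = recorded_intensity(0.0)
i1 = recorded_intensity(np.pi / 2)   # shift the object's phase

# The fringes move: the recorded pattern changes even though the
# object's *intensity* (|obj|^2 = 1) did not change at all.
assert not np.allclose(i0, i1)
```

The recorded pattern is 2 + 2·cos(Δφ), so the phase difference between object and reference is written into the fringe positions; that is what lets a hologram store depth where a photograph cannot.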
The problem is that the spatial resolution of a photograph or hologram is limited by the wavelength of the light, at or just below 1 μm (0.001 mm). That’s fine for macroscopic objects, but it starts to fail in the realm of nanotechnology.
Now researchers from Fabrizio Carbone’s lab at EPFL have developed a method to see how light behaves at the tiniest scales, well beyond wavelength limitations. The researchers used a most unusual photographic medium: freely propagating electrons. Implemented in their ultrafast electron microscope, the method can encode quantum information in a holographic light pattern trapped in a nanostructure, and is based on an exotic aspect of the interaction between electrons and light.
Are you — is every person you’ve ever loved, every incredible sight you’ve ever witnessed — part of a hologram? Some scientists think so.
They argue that all the information in the universe may be stored on some sort of two-dimensional object. In this video, NASA astronomer Michelle Thaller delves into frontier science — uncharted territory that may require a new level of physics to better understand.
In what the German automaker is calling a “world premiere,” Volkswagen’s futuristic Golf GTI Aurora concept has a high-end sound system in its trunk that can be operated with a hologram.
You can leave your 3D glasses and augmented reality gloves at home: the hologram floats freely in the air and can be operated without any external aids. Though to be fair, VW is being very vague about the details of the technology behind the interface.
One day soon, you may be filling your lungs with crisp ocean air, your arms bathed in warm light as the sun sets over softly lapping waters, and you may wonder: is this real? Or are scientists projecting holograms into my brain to create a vivid sensory experience that isn’t actually happening? A group of researchers at the University of California, Berkeley is in the early stages of testing their ability to create, edit, and scrub sensory experiences from your brain, both in real time and in stored experiences: memories.
Using light to make us see what isn’t there.
Different sensory experiences show up in brain imaging as patterns of neurons firing in sequence. Neuroscientists are trying to reverse-engineer experiences by stimulating neurons to excite the same neural patterns. At present, the steps to accomplish this are somewhat invasive. Scientists genetically modify neurons with photosensitive proteins so they can gingerly manipulate the neurons using light, a process known as optogenetics. A metal head plate is also surgically implanted over the targeted area.
In conventional holography, a photographic film records the interference pattern of monochromatic light scattered from the object to be imaged with a reference beam of unscattered light. Illuminating the developed film with a replica of the reference beam then recreates a virtual image of the original object. Holography was originally proposed by the physicist Dennis Gabor in 1948 to improve the resolution of the electron microscope, and was first demonstrated with light optics. A hologram is formed by capturing the phase and amplitude distribution of a signal by superimposing it with a known reference. The original concept was followed by holography with electrons, and after the invention of lasers, optical holography became a popular technique for 3D imaging of macroscopic objects, information encryption, and microscopy.
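The record-then-illuminate scheme described above can be sketched as a toy 1D simulation: record the intensity of object plus reference, then multiply the developed pattern by a replica of the reference. This is a hedged illustration assuming unit-amplitude waves and made-up parameters, not the researchers' actual setup.

```python
import numpy as np

x = np.linspace(0, 20e-6, 2000)          # 1D "film" coordinate, metres
k = 2 * np.pi / 0.633e-6                 # HeNe-laser-like wavelength

# Object wave carrying an unknown phase profile, and a tilted reference.
object_phase = 2.0 * np.sin(2 * np.pi * x / 5e-6)
obj = np.exp(1j * object_phase)
ref = np.exp(1j * k * x * np.sin(0.05))

# Step 1: record the interference pattern (intensity only, a real array).
hologram = np.abs(obj + ref) ** 2

# Step 2: illuminate the developed film with a replica of the reference.
# Expanding hologram * ref gives: 2*ref + obj + conj(obj)*ref**2.
# One of those terms is exactly the original object wave.
reconstruction = hologram * ref

# In a real off-axis setup the unwanted "twin image" terms are separated
# by angle; here we subtract them analytically to expose the object wave.
twin_terms = 2 * ref + np.conj(obj) * ref ** 2
recovered = reconstruction - twin_terms

assert np.allclose(recovered, obj)       # full amplitude AND phase return
```

The key point the sketch shows is that illuminating the recorded intensity with the reference reproduces the complex object wave, phase included, which is what makes the reconstructed image three-dimensional.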
In the new method, the scientists relied on electromagnetic fields to split an electron wave function into a quantum coherent superposition of different energy states. The technique deviates from the conventional approach, in which the signal of interest and the reference are spatially separated and then recombined to reconstruct the amplitude and phase of the signal and form the hologram. The principle can be extended to any detection configuration involving a periodic signal capable of undergoing interference, including sound waves, X-rays, or femtosecond pulse waveforms.
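The energy-superposition idea can be illustrated with a textbook two-level example: the probability density of a coherent superposition of two energy eigenstates oscillates at the difference frequency, which is exactly the kind of periodic, interference-capable signal the passage describes. This is a minimal sketch in natural units with arbitrary energies, not the lab's model.

```python
import numpy as np

hbar = 1.0                       # natural units
E1, E2 = 1.0, 3.0                # two energy eigenvalues (arbitrary)
t = np.linspace(0, 4 * np.pi, 1000)

# Equal-weight coherent superposition of the two energy states.
psi = (np.exp(-1j * E1 * t / hbar) + np.exp(-1j * E2 * t / hbar)) / np.sqrt(2)

density = np.abs(psi) ** 2

# |psi|^2 = 1 + cos((E2 - E1) * t / hbar): a beat at the difference
# frequency. Each eigenstate alone would give a constant density of 1;
# only the coherent superposition produces this periodic interference.
expected = 1 + np.cos((E2 - E1) * t / hbar)
assert np.allclose(density, expected)
```

The beat is what can interfere with a reference oscillation, in the same role that the tilted reference beam plays in conventional optical holography.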