
We love the sort of outlandish concept cars that treat themselves as high art, and regret that COVID-19 and the general death of the big auto show are increasingly denying us this pleasure. But here is a spectacular example of the genre to lift the late-holiday-weekend malaise: meet the WayRay Holograktor.

Gesture interface company Leap Motion is announcing an ambitious, but still very early, plan for an augmented reality platform based on its hand tracking system. The system is called Project North Star, and it includes a design for a headset that Leap Motion claims costs less than $100 at large-scale production. The headset would be equipped with a Leap Motion sensor, so users could precisely manipulate objects with their hands — something the company has previously offered for desktop and VR displays.

Project North Star isn’t a new consumer headset, nor will Leap Motion be selling a version to developers at this point. Instead, the company is releasing the necessary hardware specifications and software under an open source license next week. “We hope that these designs will inspire a new generation of experimental AR systems that will shift the conversation from what an AR system should look like, to what an AR experience should feel like,” the company writes.

The headset design uses two fast-refreshing 3.5-inch LCD displays with a resolution of 1600×1440 per eye. The displays reflect their light onto a visor that the user perceives as a transparent overlay. Leap Motion says this offers a field of view that’s 95 degrees high and 70 degrees wide, larger than most AR systems that exist today. The Leap Motion sensor fits above the eyes and tracks hand motion across a far wider field of view, around 180 degrees horizontal and vertical.
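Those numbers imply a fairly modest angular resolution. As a rough back-of-envelope check (assuming, since the article doesn't say, that the 1600-pixel dimension spans the 95-degree axis and the 1440-pixel dimension spans the 70-degree axis, and ignoring optical distortion):

```python
# Back-of-envelope angular resolution for the headset specs above.
# Assumption: the 1600-pixel axis covers the 95-degree field of view and
# the 1440-pixel axis covers the 70-degree field; real optics distort this.

def pixels_per_degree(pixels: int, fov_degrees: float) -> float:
    """Average pixels per degree across one axis of the field of view."""
    return pixels / fov_degrees

vertical_ppd = pixels_per_degree(1600, 95)    # ~16.8 px/deg
horizontal_ppd = pixels_per_degree(1440, 70)  # ~20.6 px/deg
print(f"{vertical_ppd:.1f} px/deg vertical, {horizontal_ppd:.1f} px/deg horizontal")
```

For comparison, 20/20 human vision resolves roughly 60 pixels per degree, which is why even a wide-field headset like this still looks noticeably pixelated up close.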

Over the past several decades, researchers have moved from using electric currents to manipulating light waves in the near-infrared range for telecommunications applications such as high-speed 5G networks, biosensors on a chip, and driverless cars. This research area, known as integrated photonics, is fast evolving and investigators are now exploring the shorter—visible—wavelength range to develop a broad variety of emerging applications. These include chip-scale LIDAR (light detection and ranging), AR/VR/MR (augmented/virtual/mixed reality) goggles, holographic displays, quantum information processing chips, and implantable optogenetic probes in the brain.

The one device critical to all these applications is the optical phase modulator, which controls the phase of a light wave, similar to how the phase of radio waves is modulated in wireless computer networks. With a phase modulator, researchers can build an on-chip optical switch that channels light into different waveguide ports. With a large network of these optical switches, researchers could create sophisticated integrated optical systems that control light propagating on a tiny chip, or light emission from the chip.
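To make the phase-to-switching connection concrete, here is an idealized sketch (not a model of any specific device in the article) of a 2×2 Mach-Zehnder switch: two 50/50 couplers with a phase modulator in one arm. Tuning the modulator's phase steers the light between the two output ports.

```python
# Illustrative sketch: a 2x2 Mach-Zehnder optical switch built from two
# 50/50 directional couplers with a phase modulator in one arm.
# Idealized and lossless; tuning the phase reroutes the light.
import cmath
import math

def mzi_output_powers(phase: float) -> tuple[float, float]:
    """Output powers (port 1, port 2) for unit power entering port 1."""
    # 50/50 directional-coupler transfer matrix
    c = [[1 / math.sqrt(2), 1j / math.sqrt(2)],
         [1j / math.sqrt(2), 1 / math.sqrt(2)]]
    # Phase modulator acting on the upper arm only
    p = [[cmath.exp(1j * phase), 0], [0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    m = matmul(c, matmul(p, c))      # coupler -> phase shifter -> coupler
    out = [m[0][0], m[1][0]]         # field amplitudes for input [1, 0]
    return abs(out[0]) ** 2, abs(out[1]) ** 2

print(mzi_output_powers(0.0))       # ~ (0.0, 1.0): light exits the cross port
print(mzi_output_powers(math.pi))   # ~ (1.0, 0.0): light exits the bar port
```

A grid of such switches, each needing its own phase modulator, is exactly the kind of large-scale network the article describes, which is why the size and power draw of each modulator matters so much.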

But phase modulators in the visible range are very hard to make: there are no materials that are transparent enough in the visible spectrum while also providing large tunability, either through thermo-optical or electro-optical effects. Currently, the two most suitable materials are silicon nitride and lithium niobate. While both are highly transparent in the visible range, neither one provides very much tunability. Visible-spectrum phase modulators based on these materials are thus not only large but also power-hungry: the length of individual waveguide-based modulators ranges from hundreds of microns to several mm and a single modulator consumes tens of mW for phase tuning. Researchers trying to achieve large-scale integration—embedding thousands of devices on a single microchip—have, up to now, been stymied by these bulky, energy-consuming devices.
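The length scale follows directly from the phase a waveguide imparts: a modulator of length L with index change Δn shifts the phase by Δφ = 2πΔnL/λ, so a π shift needs L_π = λ/(2Δn). Plugging in an illustrative thermo-optic index change of 10⁻³ (an assumed round number, not a figure from the article) reproduces the hundreds-of-microns scale quoted above:

```python
# Why visible-range phase modulators are long: a waveguide of length L with
# index change delta_n shifts phase by 2 * pi * delta_n * L / wavelength,
# so a pi shift needs L_pi = wavelength / (2 * delta_n).
# delta_n = 1e-3 below is an illustrative assumption.

def pi_shift_length_um(wavelength_nm: float, delta_n: float) -> float:
    """Waveguide length (in microns) needed for a pi phase shift."""
    wavelength_um = wavelength_nm / 1000.0
    return wavelength_um / (2.0 * delta_n)

# Red light at 630 nm with a thermo-optic index change of 1e-3:
print(f"{pi_shift_length_um(630, 1e-3):.0f} um")  # 315 um
```

Materials with larger achievable Δn shrink L_π proportionally, which is the whole motivation for finding something more tunable than silicon nitride or lithium niobate.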

Working at the intersection of hardware and software engineering, researchers are developing new techniques for improving 3D displays for virtual and augmented reality technologies.

Virtual and augmented reality headsets are designed to place wearers directly into other environments, worlds and experiences.

While the technology is already popular among consumers for its immersive quality, there could be a future where the holographic displays look even more like real life. In their own pursuit of these better displays, the Stanford Computational Imaging Lab has combined their expertise in optics and artificial intelligence. Their most recent advances in this area are detailed in a paper published in Science Advances and work that will be presented at SIGGRAPH ASIA 2021 in December.

In the wee morning hours of Tuesday (Nov. 16), the seven-person crew of the International Space Station (ISS) awoke in alarm. A Russian missile test had just blasted a decommissioned Kosmos spy satellite into more than 1,500 pieces of space debris — some of which were close enough to the ISS to warrant emergency collision preparations.

The four Americans, one German and two Russian cosmonauts aboard the station were told to shelter in the transport capsules that brought them to the ISS, while the station passed by the debris cloud several times over the following hours, according to NASA.

Ultimately, Tuesday ended without any reported damage or injury aboard the ISS, but the crew’s precautions — and the NASA administrator’s stern response to Russia — were far from an overreaction. Space debris like the kind created in the Kosmos break-up can travel at more than 17,500 mph (28,000 km/h), NASA says — and even a scrap of metal the size of a pea can become a potentially deadly missile in low-Earth orbit. (For comparison, a typical bullet discharged from an AR-15 rifle travels at just over 2,200 mph, or 3,500 km/h).


Facebook’s vision of the metaverse has been criticized by both consumers and other companies for its obviously dystopian outlook. But Niantic, one of the most prominent augmented reality companies in the world, has shown a much better-looking vision of the metaverse: one in which the real world is augmented rather than completely replaced, as in Meta’s version. Niantic’s Lightship platform and future augmented reality glasses are meant to be a look into a future where privacy and social interaction are of the utmost importance and the dystopian nightmare wouldn’t be a problem. Let’s see what companies such as Apple or Niantic think of this.

00:00 The unfortunate fate of the Metaverse.
02:01 What is this future going to look like?
03:59 Facebook’s Creepy Vision of the Workplace.
06:29 A possible solution by Niantic.
08:35 Last Words.

#facebook #meta #metaverse

Apple and Meta are on a collision course around wearables, AR/VR headsets and home devices. Also: Netflix and Apple mend fences around billing, Tim Cook talks cryptocurrency, and a new Apple Store is coming to Los Angeles. Finally, the App Store is dealt a loss in court.

For the past decade or so, Apple Inc.’s chief rival was considered to be Google. The two have gone toe-to-toe in smartphones, mobile operating systems, web services and home devices.

The next decade, however, could be defined by Apple’s rivalry with another Silicon Valley giant: Meta Platforms Inc.—the company known to everyone other than its own brand consultants as Facebook.

WIRED sat down with West to sift fantasy from reality and pin down what XR is actually good at. And it may come as a surprise that a lot of it relies on collecting a lot of data. The following interview is a transcript of our conversation, lightly edited for clarity and length.

WIRED: So let’s start with sort of an ontological question. There’s been this idea that we’ll be in or go to the metaverse, or several metaverses, which tech companies posit will exist in VR or AR. Do you see VR and AR as being more of a tool or a destination?

Timoni West: That’s a great question. I would actually say neither. I see XR as one of the many different mediums you could choose to work in. For example, we actually have an AR mobile companion app [in beta] that allows you to scan a space and gray box it out, put down objects, automatically tag things. So I’m using AR to do the things that AR is best for. I’ll use VR to do the things that VR is best for, like presence, being able to meet together, sculpt, or do anything that’s, you know, sort of intrinsically 3D.

Smartphones have become old technology and will soon be replaced by the next big thing. Vuzix has created the first lightweight smart AR glasses that can project holograms at high contrast while still looking like regular glasses.

The Vuzix next-generation smart glasses feature futuristic micro-LED display technology to project augmented reality images onto the lenses, letting you interact with virtual objects and more. Companies like Apple and Facebook will soon follow with their own variations of AR glasses, which are expected to replace smartphones as the main medium of interaction as phones become obsolete.

If you enjoyed this video, please consider rating it and subscribing to our channel for more frequent uploads. Thank you!

#vuzix #smartphones #augmentedreality

An innovator in early AR systems has a dire prediction: the metaverse could change the fabric of reality as we know it.

Louis Rosenberg, a computer scientist and developer of the first functional AR system at the Air Force Research Laboratory, penned an op-ed in Big Think this weekend warning that the metaverse — an immersive VR and AR world currently being developed by The Company Formerly Known as Facebook — could create what sounds like a real-life cyberpunk dystopia.

“I am concerned about the legitimate uses of AR by the powerful platform providers that will control the infrastructure,” Rosenberg wrote in the essay.