The immersive tech could eventually allow park visitors to interact with Mickey Mouse and Elsa as images, not cast members in costume.
Disney is joining the metaverse party.
We aren’t talking online gigs or business meetings with avatars. Disney wants to enhance the virtual dimension of its theme parks with its Virtual World Simulator, new technology for which it was granted a patent in the U.S. on December 28.
The system could be used as follows: a user enters a venue or ride in which images are projected onto flat and curved surfaces, creating an immersive virtual environment. The user’s movements are tracked and the projections change accordingly, maintaining the sense of a complex, coherent world. Their shifting viewpoint is gauged with a technique called Simultaneous Localization and Mapping, or SLAM.
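Disney's patent does not disclose implementation details, but the core geometric idea of viewpoint-dependent projection can be sketched. Assuming SLAM supplies the viewer's tracked eye position, the renderer must find where on the projection surface a virtual point should be drawn so that it appears fixed in space; the function name and coordinates below are illustrative, not from the patent.

```python
import numpy as np

def project_to_surface(eye, point, plane_z=0.0):
    """Find where a virtual 3D point should be drawn on a flat
    projection surface (the plane z = plane_z) so that it appears
    fixed in space from the viewer's tracked eye position."""
    eye = np.asarray(eye, dtype=float)
    point = np.asarray(point, dtype=float)
    # Intersect the eye-to-point line with the projection plane.
    t = (plane_z - eye[2]) / (point[2] - eye[2])
    return eye + t * (point - eye)

# As the tracked viewer moves, the drawn position shifts to keep
# the virtual point visually anchored in the world.
a = project_to_surface(eye=[0.0, 0.0, 2.0], point=[0.5, 0.0, -1.0])
b = project_to_surface(eye=[1.0, 0.0, 2.0], point=[0.5, 0.0, -1.0])
```

Recomputing this projection every frame from the tracked pose is what keeps the imagery coherent as the visitor moves.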
Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.
Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.
The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast, fine-grained dataset to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.
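The slicing idea can be illustrated with a toy sketch: scattered measurements are binned into a regular latitude/longitude/depth grid, and each depth layer of the grid is one horizontal "slice" of the atlas. The data here is synthetic and the grid resolution is arbitrary; this is not the MIT team's actual processing pipeline.

```python
import numpy as np

# Toy illustration: turn scattered (lat, lon, depth, oxygen) measurements
# into a gridded 3D "atlas", then take horizontal slices at fixed depths.
rng = np.random.default_rng(0)
n = 10_000
lat = rng.uniform(-20, 20, n)
lon = rng.uniform(-180, -80, n)
depth = rng.uniform(0, 1000, n)
oxygen = rng.uniform(0, 250, n)   # synthetic concentrations

# Bin edges: 1-degree horizontal cells, 50 m depth layers.
lat_edges = np.arange(-20, 21, 1)
lon_edges = np.arange(-180, -79, 1)
depth_edges = np.arange(0, 1001, 50)

sums, _ = np.histogramdd((lat, lon, depth),
                         bins=(lat_edges, lon_edges, depth_edges),
                         weights=oxygen)
counts, _ = np.histogramdd((lat, lon, depth),
                           bins=(lat_edges, lon_edges, depth_edges))
# Mean oxygen per grid cell; cells with no measurements stay NaN.
mean_oxygen = np.divide(sums, counts,
                        out=np.full_like(sums, np.nan),
                        where=counts > 0)

# One "slice" of the 3D scan: mean oxygen in the 300-350 m layer.
layer = mean_oxygen[:, :, 6]
```

Stacking such slices across all depth layers yields the three-dimensional volume from which extent and fine-scale features can be read off.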
Summary: Neurons in the primary olfactory cortex play a role in encoding spatial maps, a new study reports.
Source: Champalimaud Centre for the Unknown.
Smell has the power to transport us across time and space. It could be the sweet fragrance of jasmine, or the musty scent of algae. Suddenly, you are back at your childhood home, or under the burning sun of a distant shore.
OneZoom is a one-stop site for exploring all life on Earth, its evolutionary history, and how much of it is threatened with extinction.
The OneZoom explorer – available at onezoom.org – maps the connections between 2.2 million living species, the closest thing yet to a single view of all species known to science. The interactive tree of life allows users to zoom in to any species and explore its relationships with others, in a seamless visualisation on a single web page. The explorer also includes images of over 85,000 species, plus, where known, their vulnerability to extinction.
OneZoom was developed by Imperial College London biodiversity researcher Dr. James Rosindell and University of Oxford evolutionary biologist Dr. Yan Wong. In a paper published today in Methods in Ecology and Evolution, Drs Wong and Rosindell present the result of over ten years of work, gradually creating what they regard as “the Google Earth of biology.”
In a paper published today in the scientific journal Science, DeepMind demonstrates how neural networks can be used to describe electron interactions in chemical systems more accurately than existing methods.
Density Functional Theory, established in the 1960s, relates a system’s electron density to its interaction energy. For more than 50 years, however, the exact form of that mapping (the so-called density functional) has remained unknown. In a significant advance for the field, DeepMind has shown that neural networks can approximate this functional more accurately than was previously attainable.
By expressing the functional as a neural network and incorporating exact properties into the training data, DeepMind was able to train the model to learn functionals free from two important systematic errors—the delocalisation error and spin symmetry breaking—resulting in a better description of a broad class of chemical reactions.
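The idea of "expressing the functional as a neural network" can be sketched with a minimal example. This is not DeepMind's actual model or training scheme, just a tiny network mapping per-grid-point density features to an energy density that is then integrated over the grid; the feature choices and layer sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def mlp_init(sizes):
    """Random weights for a small fully connected network."""
    return [(rng.normal(0, 0.1, (a, b)), np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def functional(params, features):
    """Map per-grid-point density features to an energy density,
    then integrate (here: sum) over the grid for a total energy."""
    h = features
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)
    W, b = params[-1]
    return np.sum(h @ W + b)

params = mlp_init([3, 16, 16, 1])   # 3 toy features per grid point
features = rng.random((100, 3))     # e.g. density, its gradient, spin
energy = functional(params, features)
```

In the real work, training such a network on accurate reference data, with exact physical constraints built in, is what suppresses the delocalisation and spin-symmetry-breaking errors.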
Microsoft has announced a new DirectX12 API for Windows which will offer a new way for apps to efficiently encode video using the GPU.
The Video Encode API is available to third-party apps, is native to Windows 11, and can efficiently encode video in the H.264 and HEVC formats.
Microsoft says the API exposes a considerable number of configurable parameters, letting developers tune the encoding process to fit their scenarios. These include a custom slice-partitioning scheme; active (CBR, VBR, QVBR) and passive (absolute/delta custom QP maps) rate-control modes; custom codec encoding tools; custom codec block and transform sizes; a motion-vector precision limit; explicit use of intra-refresh sessions; dynamic reconfiguration of video stream resolution, rate control, and slice partitioning; and more.
Israeli drone manufacturer Airobotics has collaborated with Israeli solar farm services company Solar Drone to develop and supply to Solar Drone a unique solar panel cleaning drone system. The fully automated system will include a drone docking station for automatic battery replacement and cleaning fluid replenishment, enabling the system to operate continuously.
While solar panels are essentially maintenance-free, they do require cleaning from time to time to function properly. Dirt, dust, mud, and bird droppings greatly reduce solar panel efficiency, cutting power output. Frequent cleaning is expensive and time-consuming, especially when panels are remote, difficult to access, or difficult to clean.
A new “drone-in-a-box”-type system is now being developed to do this job. A quadcopter is housed inside a weatherproof dock located near the solar panels. At regular intervals, the doors on top of the station will open, releasing the drone. The drone will then take off and fly to the panels, using LiDAR sensors and mapping cameras for more accurate positioning. Each panel will be sprayed with a cleaning fluid, and after completing the task, the drone will return to the docking station. If necessary, the robotic system will replace the discharged battery with a charged one and swap the cleaning fluid container for a full one.
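The clean/return/swap cycle described above can be sketched as a simple loop. All class names, consumption rates, and thresholds below are hypothetical stand-ins, not Airobotics' design.

```python
from dataclasses import dataclass

@dataclass
class Drone:
    battery: float = 100.0   # percent remaining
    fluid: float = 100.0     # percent remaining

    def clean_panel(self):
        # Illustrative per-panel consumption rates.
        self.battery -= 15.0
        self.fluid -= 20.0

def run_cycle(drone, panels, min_battery=20.0, min_fluid=20.0):
    """Clean panels continuously; when supplies run low, return to
    the dock, where the battery and fluid container are swapped."""
    cleaned = 0
    for _ in range(panels):
        if drone.battery < min_battery or drone.fluid < min_fluid:
            # Dock visit: automatic battery and fluid-container swap.
            drone.battery, drone.fluid = 100.0, 100.0
        drone.clean_panel()
        cleaned += 1
    return cleaned
```

The automatic swap at the dock is what lets the system run continuously rather than pausing to recharge.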