A new set of equations can precisely describe the reflections of the Universe that appear in the warped light around a black hole.
How close each reflection appears depends on the angle from which the black hole is observed and on the rate of the black hole's spin, according to a mathematical solution worked out by physics student Albert Sneppen of the Niels Bohr Institute in Denmark.
This is really cool, absolutely, but it’s not just really cool. It also potentially gives us a new tool for probing the gravitational environment around these extreme objects.
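For the simplest, non-spinning (Schwarzschild) case, the long-known result is that each successive mirror image of the background sky appears roughly e^(2π), about 535 times, closer to the black hole's edge than the one before it; Sneppen's contribution is a general expression for how that factor changes with spin and viewing angle. The sketch below reproduces only the classic non-spinning factor and does not attempt his spin-dependent formula.

```python
import math

# Schwarzschild (non-spinning) case: each successive image of the background
# sky lies closer to the black hole's edge by a factor of exp(2*pi) ~ 535.
demagnification = math.exp(2 * math.pi)

# Angular offset of the n-th repeated image, relative to the first image
# (illustrative scaling only, not Sneppen's spin- and angle-dependent result).
def relative_offset(n: int) -> float:
    return demagnification ** -(n - 1)

for n in range(1, 5):
    print(f"image {n}: {relative_offset(n):.2e} of the first image's offset")
```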
Sky surveys are invaluable for exploring the universe, allowing celestial objects to be catalogued and analyzed without the need for lengthy observations. But in providing a general map or image of a region of the sky, they are also one of the largest data generators in science, currently imaging tens of millions to billions of galaxies over the lifetime of an individual survey. In the near future, for example, the Vera C. Rubin Observatory in Chile will produce 20 TB of data per night, generate about 10 million alerts daily, and end with a final data set of 60 PB in size.
As a result, sky surveys have become increasingly labor-intensive when it comes to sifting through the gathered datasets to find the most relevant information or new discoveries. In recent years machine learning has added a welcome twist to the process, primarily in the form of supervised and unsupervised algorithms used to train the computer models that mine the data. But these approaches present their own challenges; supervised learning, for example, requires image labels that must be assigned manually, a task that is not only time-consuming but restrictive in scope: at present, only about 1% of all known galaxies have been assigned such labels.
To address these limitations, a team of researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) is exploring a new tack: self-supervised representation learning. Like unsupervised learning, self-supervised learning eliminates the need for training labels, instead attempting to learn by comparison. By introducing certain data augmentations, self-supervised algorithms can be used to build “representations”—low-dimensional versions of images that preserve their inherent information—and have recently been demonstrated to outperform supervised learning on industry-standard image datasets.
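The excerpt doesn't include the team's training code, but the contrastive flavor of self-supervised learning it describes can be sketched in a few lines: generate two augmented views of each image and train an encoder so that the two views of the same image map to nearby representations while views of different images are pushed apart. The PyTorch snippet below is a generic illustration; the encoder, augmentations, and loss settings are placeholders, not the Berkeley Lab pipeline.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z1, z2, temperature=0.5):
    """NT-Xent-style loss: z1[i] and z2[i] are embeddings of two augmented
    views of the same image; all other pairs in the batch act as negatives."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2n, d), unit length
    sim = z @ z.t() / temperature                        # scaled cosine similarities
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float('-inf'))           # a sample is never its own positive
    # The positive for sample i is its other view, at index (i + n) mod 2n.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

# Toy usage: a tiny "encoder" stands in for the network that maps galaxy
# cutouts to low-dimensional representations.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(64 * 64, 128))
images = torch.rand(32, 1, 64, 64)                 # a batch of image cutouts
view1 = images + 0.1 * torch.randn_like(images)    # placeholder augmentations
view2 = torch.flip(images, dims=[-1])              # (noise, horizontal flip)
loss = contrastive_loss(encoder(view1), encoder(view2))
loss.backward()
```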
Early in its history, computing was dominated by time-sharing systems. These systems were powerful machines (for their time, at least) that multiple users connected to in order to perform computing tasks. To an extent, quantum computing has repeated this history, with companies like Honeywell, IBM, and Rigetti making their machines available to users via cloud services. Companies pay based on the amount of time they spend executing algorithms on the hardware.
For the most part, time-sharing works out well, saving companies the expenses involved in maintaining the machine and its associated hardware, which often includes a system that chills the processor down to nearly absolute zero. But there are several customers—companies developing support hardware, academic researchers, etc.—for whom access to the actual hardware could be essential.
The fact that companies aren’t shipping out processors suggests that the market isn’t big enough to make production worthwhile. But a startup from the Netherlands is betting that the size of the market is about to change. On Monday, a company called QuantWare announced that it will start selling quantum processors based on transmons, superconducting qubits built around Josephson junctions and the same basic technology that underpins machines from Google, IBM, and Rigetti.
In 2015, after 85 years of searching, researchers confirmed the existence of a massless particle called the Weyl fermion. With the unique ability to behave as both matter and antimatter inside a crystal, this quasiparticle is like an electron with no mass. The story began in 1928, when Dirac proposed an equation that unified quantum mechanics and special relativity to describe the nature of the electron. The equation suggested three distinct forms of relativistic particles: the Dirac, Majorana, and Weyl fermions. More recently, an analog of Weyl fermions was discovered in certain electronic materials that exhibit strong spin–orbit coupling and topological behavior. Just as Dirac fermions emerge as signatures of topological insulators, electrons in certain types of semimetals can behave like Weyl fermions.
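For reference, the link between the two equations can be written out: setting the mass to zero in Dirac's equation decouples it into a pair of two-component equations, one for each handedness, which are the Weyl equations. These are standard textbook forms, not expressions taken from the discovery papers:

```latex
% Dirac equation; in the chiral basis the m -> 0 limit splits into two
% two-component Weyl equations, one per handedness.
(i\gamma^{\mu}\partial_{\mu} - m)\,\psi = 0
\quad\xrightarrow{\;m \to 0\;}\quad
i\,\bar{\sigma}^{\mu}\partial_{\mu}\,\psi_{L} = 0,
\qquad
i\,\sigma^{\mu}\partial_{\mu}\,\psi_{R} = 0,
% with \sigma^{\mu} = (\mathbb{1}, \vec{\sigma}) and
%      \bar{\sigma}^{\mu} = (\mathbb{1}, -\vec{\sigma}).
```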
These Weyl fermions are what are known as quasiparticles, which means they can exist only inside a solid such as a crystal, not as standalone particles. As complex as quasiparticles sound, their behavior is actually much simpler than that of fundamental particles, because their properties allow them to shrug off the forces that knock their counterparts around. The discovery of Weyl fermions is huge not just because it finally proves that these elusive particles exist, but because it paves the way for far more efficient electronics and new types of quantum computing. Weyl fermions could help clear the electron traffic jams that slow today's electronics: Weyl electrons can carry charge at least 1,000 times faster than electrons in ordinary semiconductors, and twice as fast as electrons in graphene. This could lead to a whole new type of electronics dubbed ‘Weyltronics’.
Google, Nvidia, and others are training algorithms in the dark arts of designing semiconductors—some of which will be used to run artificial intelligence programs.
The US Defense Advanced Research Projects Agency (DARPA) has selected three teams of researchers led by Raytheon, BAE Systems, and Northrop Grumman to develop event-based infrared (IR) camera technologies under the Fast Event-based Neuromorphic Camera and Electronics (FENCE) program. The program is designed to make computer vision cameras more efficient by mimicking how the human brain processes information. Specifically, FENCE aims to develop a new class of low-latency, low-power, event-based infrared focal plane arrays (FPAs), along with the digital signal processing (DSP) and machine learning (ML) algorithms needed to use them. These neuromorphic camera technologies will enable intelligent sensors that can handle more dynamic scenes and aid future military applications.
New intelligent event-based — or neuromorphic — cameras can handle more dynamic scenes.
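The FENCE announcement doesn't detail the sensor design, but the general principle of an event-based (neuromorphic) pixel is straightforward: rather than reading out full frames at a fixed rate, each pixel reports an asynchronous event only when its local brightness changes by more than a threshold, so static parts of a scene produce no data at all. The toy sketch below illustrates that readout model; the threshold, log-intensity trick, and event format are generic assumptions, not DARPA's specification.

```python
import numpy as np

def frames_to_events(frames, threshold=0.15):
    """Convert a stack of intensity frames (t, h, w) into a sparse event list.
    Each pixel fires an event only when its log-intensity has changed by more
    than `threshold` since the last event it emitted (the core idea behind
    event-based / neuromorphic cameras)."""
    log_frames = np.log(frames.astype(float) + 1e-6)
    reference = log_frames[0].copy()       # last log-intensity that triggered an event
    events = []                            # (time, y, x, polarity)
    for t in range(1, len(log_frames)):
        delta = log_frames[t] - reference
        ys, xs = np.nonzero(np.abs(delta) > threshold)
        for y, x in zip(ys, xs):
            events.append((t, y, x, 1 if delta[y, x] > 0 else -1))
            reference[y, x] = log_frames[t, y, x]
        # pixels with no change keep their reference: static scenes produce no data
    return events

# Example: a bright spot drifting across an otherwise static scene.
frames = np.full((10, 32, 32), 50.0)
for t in range(10):
    frames[t, 16, 3 * t] = 200.0
print(len(frames_to_events(frames)), "events from 10 frames")
```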
An elegant new algorithm developed by Danish researchers can significantly reduce the resource consumption of the world’s computer servers. Computer servers take as heavy a toll on the climate as all of global air traffic, which makes the green transition in IT an urgent matter. The researchers, from the University of Copenhagen, expect major IT companies to deploy the algorithm immediately.
One of the flip sides of our runaway internet usage is its climate impact, driven by the massive amount of electricity that computer servers consume. Current CO2 emissions from data centers are as high as those from all of global air traffic, and they are expected to double within just a few years.
Only a handful of years have passed since Professor Mikkel Thorup was among a group of researchers behind an algorithm that addressed part of this problem by producing a groundbreaking recipe to streamline computer server workflows. Their work saved energy and resources. Tech giants including Vimeo and Google enthusiastically implemented the algorithm in their systems, with online video platform Vimeo reporting that the algorithm had reduced their bandwidth usage by a factor of eight.
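The excerpt doesn't name that earlier algorithm, but the Thorup co-authored work that Google and Vimeo adopted is generally described as consistent hashing with bounded loads, which caps how far any single server's load may drift above the average. The sketch below is a simplified illustration of that idea under that assumption, not the authors' production code.

```python
import hashlib
import math
from bisect import bisect_left

def _hash(key: str) -> int:
    # Stable hash so keys and servers land at the same ring positions between runs.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class BoundedLoadRing:
    """Simplified consistent hashing with bounded loads: a request walks the
    hash ring clockwise and lands on the first server whose load is still
    below ceil(capacity * average load). Illustration only."""

    def __init__(self, servers, capacity=1.25):
        self.capacity = capacity
        self.ring = sorted((_hash(s), s) for s in servers)
        self.load = {s: 0 for s in servers}
        self.total = 0

    def assign(self, request_key: str) -> str:
        # Per-server cap once this request is placed; capacity > 1 guarantees
        # that some server below the cap always exists, so the loop terminates.
        cap = math.ceil(self.capacity * (self.total + 1) / len(self.load))
        points = [h for h, _ in self.ring]
        i = bisect_left(points, _hash(request_key)) % len(self.ring)
        while self.load[self.ring[i][1]] >= cap:   # skip servers that are "full"
            i = (i + 1) % len(self.ring)
        server = self.ring[i][1]
        self.load[server] += 1
        self.total += 1
        return server

ring = BoundedLoadRing([f"server-{i}" for i in range(8)])
for r in range(200):
    ring.assign(f"video-chunk-{r}")
print(ring.load)   # no server holds more than ceil(1.25 * 200 / 8) = 32 requests
```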
“Because nothing can protect hardware, software, applications or data from a quantum-enabled adversary, encryption keys and data will require re-encrypting with a quantum-resistant algorithm and deleting or physically securing copies and backups.”
To ease the disruption caused by moving away from quantum-vulnerable cryptographic code, NIST has released a draft document describing the first steps of that journey.
Physics-informed machine learning might help verify microchips.
Physicists love recreating the world in software. A simulation lets you explore many versions of reality to find patterns or to test possibilities. But if you want one that’s realistic down to individual atoms and electrons, you run out of computing juice pretty quickly.
Machine-learning models can approximate detailed simulations, but they often require lots of expensive training data. A new method shows that physicists can lend their expertise to machine-learning algorithms, helping them train on a few small simulations of just a handful of atoms and then predict the behavior of systems with hundreds of atoms. In the future, similar techniques might even characterize microchips with billions of atoms, predicting failures before they occur.
The researchers started with simulated units of 16 silicon and germanium atoms, two elements often used to make microchips. They employed high-performance computers to calculate the quantum-mechanical interactions between the atoms’ electrons. Given a certain arrangement of atoms, the simulation generated unit-level characteristics such as its energy bands, the energy levels available to its electrons. But “you realize that there is a big gap between the toy models that we can study using a first-principles approach and realistic structures,” says Sanghamitra Neogi, a physicist at the University of Colorado, Boulder, and the paper’s senior author. Could she and her co-author, Artem Pimachev, bridge the gap using machine learning?
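The article doesn't specify the model the pair used, but the overall pattern, fitting a cheap surrogate to small first-principles cells and then querying it for much larger structures, can be sketched generically. In the snippet below the descriptors (germanium fraction and strain), the "first-principles" function, and the random-forest regressor are all stand-ins chosen so the example runs end to end, not Neogi and Pimachev's actual method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in training set: descriptors of small 16-atom Si/Ge cells paired with
# a band-gap-like quantity that an expensive quantum-mechanical run would
# normally provide. The "physics" here is a made-up function for illustration.
def fake_first_principles(ge_fraction, strain):
    return 1.1 - 0.4 * ge_fraction - 2.0 * strain + 0.01 * rng.standard_normal()

X_train = rng.uniform([0.0, -0.02], [1.0, 0.02], size=(200, 2))
y_train = np.array([fake_first_principles(f, s) for f, s in X_train])

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)

# A larger structure is described by the same cheap descriptors, so the trained
# surrogate can estimate its local band energies without a new quantum run.
large_structure = rng.uniform([0.0, -0.02], [1.0, 0.02], size=(5, 2))
print(surrogate.predict(large_structure))
```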