
Advanced uses of time in image rendering and reconstruction have been the focus of much scientific research in recent years. The motivation comes from the equivalence between space and time given by the finite speed of light c. This equivalence leads to correlations between the time evolution of electromagnetic fields at different points in space. Applications exploiting such correlations, known as time-of-flight (ToF)1 and light-in-flight (LiF)2 cameras, operate in various regimes from radio3,4 to optical5 frequencies. Time-of-flight imaging reconstructs a scene by measuring delayed stimulus responses via continuous waves, impulses, or pseudo-random binary sequence (PRBS) codes1. Light-in-flight imaging, also known as transient imaging6, explores light transport and detection2,7. The combination of ToF and LiF has recently brought higher accuracy and detail to the reconstruction process, especially for non-line-of-sight images, by modeling higher-order scattering and physical processes such as Rayleigh–Sommerfeld diffraction8. However, these methods require experimental characterization of the scene followed by large computational overheads, producing images at low frame rates in the optical regime. In the radio-frequency (RF) regime, 3D images at frame rates of 30 Hz have been produced with an array of 256 wide-band transceivers3. Microwave imaging has the additional capability of sensing through optically opaque media such as walls. Nonetheless, synthetic aperture radar reconstruction algorithms such as the one proposed in ref. 3 require each transceiver in the array to operate individually, leaving room to improve image frame rates through continuous transmit-receive captures. Beamforming approaches face similar challenges9: a narrow focused beam scans the scene using an array of antennas and frequency-modulated continuous-wave (FMCW) techniques.
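ToF systems commonly derive their PRBS codes from a linear-feedback shift register. Below is a minimal Python sketch of a standard PRBS7 generator (taps x^7 + x^6 + 1, as in ITU-T O.150); this is illustrative only — the paper does not specify which code family its sources emit.

```python
def prbs7(seed=0x7F, length=127):
    """Generate a PRBS7 bit sequence (polynomial x^7 + x^6 + 1).

    Such maximal-length sequences repeat every 2^7 - 1 = 127 bits and have a
    sharply peaked autocorrelation, which is what lets a receiver resolve the
    round-trip delay of its own code against those of the other sources.
    """
    state = seed & 0x7F  # 7-bit shift register; must start non-zero
    bits = []
    for _ in range(length):
        bits.append((state >> 6) & 1)                  # output the MSB
        feedback = ((state >> 6) ^ (state >> 5)) & 1   # XOR of taps 7 and 6
        state = ((state << 1) | feedback) & 0x7F       # shift feedback in
    return bits
```

Over one 127-bit period, a maximal-length code has 64 ones and 63 zeros, and its cyclic autocorrelation at any non-zero shift is exactly −1 (in ±1 form), which is the property that makes the codes of multiple sources nearly orthogonal.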

In this article, we develop an inverse light transport model10 for microwave signals. The model uses a spatiotemporal mask generated by multiple sources, each emitting different PRBS codes, and a single detector, all operating in continuous synchronous transmit-receive mode. This model allows image reconstructions with capture times of the order of microseconds and no prior scene knowledge. For first-order reflections, the algorithm reduces to a single dot product between the reconstruction matrix and captured signal, and can be executed in a few milliseconds. We demonstrate this algorithm through simulations and measurements performed using realistic scenes in a laboratory setting. We then use the second-order terms of the light transport model to reconstruct scene details not captured by the first-order terms.

We start by estimating the information capacity of the scene and develop the light transport equation for the transient imaging model, with arguments borrowed from basic information and electromagnetic field theory. Next, we describe the image reconstruction algorithm as a series of approximations corresponding to multiple scatterings of the spatiotemporal illumination matrix. Specifically, we show that in the first-order approximation, the value of each pixel is the dot product between the captured time series and a unique time signature generated by the spatiotemporal electromagnetic field mask. We then show how the second-order approximation recovers hidden features not accessible in the first-order image. Finally, we apply the reconstruction algorithm to simulated and experimental data and discuss the performance, strengths, and limitations of this technique.
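The first-order reconstruction described above can be sketched as a single matrix–vector product: each row of the reconstruction matrix is one pixel's time signature, and the image is its dot product with the captured time series. A toy illustration in Python (the dimensions, the random ±1 signatures, and names such as `reconstruct_first_order` are ours, not the paper's):

```python
import random

def reconstruct_first_order(signatures, captured):
    """First-order image: one dot product per pixel between the captured
    time series and that pixel's unique time signature."""
    return [sum(s * c for s, c in zip(sig, captured)) for sig in signatures]

# Toy demo: 4 pixels, 100 time samples. Each pixel's signature is a random
# +/-1 code, a stand-in for the PRBS-derived spatiotemporal mask.
random.seed(0)
n_pixels, n_samples = 4, 100
signatures = [[random.choice((-1, 1)) for _ in range(n_samples)]
              for _ in range(n_pixels)]

# Simulate a capture: the scene reflects only from pixel 2, amplitude 5.
captured = [5 * s for s in signatures[2]]

image = reconstruct_first_order(signatures, captured)
# The brightest pixel is the true reflector: its signature correlates
# perfectly with the capture, while the other random codes are nearly
# orthogonal to it.
```

Because the signatures can be precomputed once for a given source geometry, the per-frame cost is just this correlation, which is what allows the millisecond-scale first-order reconstruction claimed above.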

While chlorine and ultraviolet light are the standard means of disinfecting water, ozone is equally effective in killing germs. To date, ozone has only been used as an oxidation agent for treating water in large plants. Now, however, a project consortium from Schleswig-Holstein is developing a miniaturized ozone generator for use in smaller applications such as water dispensers or small domestic appliances. The Fraunhofer Institute for Silicon Technology ISIT has provided the sensor chip and electrode substrates for the electrolysis cell.

Compared to conventional means of disinfection such as chlorine or ultraviolet light, ozone dissolved in water has a number of advantages: it is environmentally friendly, remains active beyond its immediate place of origin, persists in the water only briefly, and leaves no residual taste. Due to its high oxidation potential, ozone is very effective at combating germs: it breaks down the cell membranes of common pathogens. In Germany, ozone is chiefly used to disinfect swimming pools and drinking water and to purify wastewater. Yet it is rarely used to disinfect water in domestic appliances such as ice machines and beverage dispensers or in other fixtures such as shower-toilets. MIKROOZON, a project funded by the State of Schleswig-Holstein and the EU, aims to change this.



The chip shoots new genetic code into cells to make them change their purpose. Researchers say it could someday be used to treat injuries in humans. But they’ve got a long, long way to go.

Researchers from the New York University School of Medicine have developed a brain implant designed to detect pain sensations in real-time and deliver bursts of pain-relieving stimulation. The device is still deeply experimental but a new proof of concept study demonstrates it working effectively in rodent models.

In the world of brain implants the chasm between science fiction and reality is still quite vast. Apart from some exciting human tests showing paralyzed individuals with implants regaining a sense of touch or controlling computers with their mind, most research in the field is still nascent.

Animal tests have demonstrated incremental technological advances, such as pigs broadcasting neural activity or monkeys playing Pong. Now, an interface that can detect pain signals in one part of the brain and immediately respond with stimulation to another part of the brain targeted to relieve that pain has been developed.

AMD & NVIDIA graphics card prices are showing signs of recovery with improved supply & availability as we enter the third quarter of 2021.


AMD & NVIDIA graphics card prices are showing signs of recovery as we enter the third quarter of 2021. Based on 3DCenter’s latest report, it looks like the worst is over and GPU supply is returning to normal at various European retailers.

AMD & NVIDIA Graphics Card Prices Returning To Normal As GPU Market Shows Signs of Recovery, Availability and Supply Improving Too

Just last month, we saw graphics card prices peak at an insane three times the MSRP. While GPU shortages were primarily to blame, cryptocurrency and gaming demand both led to insane price hikes for AMD and NVIDIA graphics cards across the globe. NVIDIA’s GeForce graphics cards peaked at over 3x the MSRP while AMD graphics cards peaked at over 2x the MSRP. The latest reports show continued improvement, but there’s still a long way to go before these cards are back at their official MSRPs.

The Hubble Space Telescope is currently offline.

On Sunday 13 June, the telescope’s payload computer went offline, and engineers here on Earth are currently performing operations to get it up and running again.

The payload computer, as you might expect, is vital to Hubble’s continued science operations. It’s the ‘brains’ of the instrument, coordinating and controlling the various instruments with which Hubble is equipped. It also monitors the telescope for issues.

As the number of qubits in early quantum computers increases, their creators are opening up access via the cloud. IBM has its IBM Q network, for instance, while Microsoft has integrated quantum devices into its Azure cloud-computing platform. By combining these platforms with quantum-inspired optimisation algorithms and variational quantum algorithms, researchers could start to see some early benefits of quantum computing in the fields of chemistry and biology within the next few years. In time, Google’s Sergio Boixo hopes that quantum computers will be able to tackle some of the existential crises facing our planet. “Climate change is an energy problem – energy is a physical, chemical process,” he says.

“Maybe if we build the tools that allow the simulations to be done, we can construct a new industrial revolution that will hopefully be a more efficient use of energy.” But eventually, the area where quantum computers might have the biggest impact is in quantum physics itself.
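The variational quantum algorithms mentioned above wrap a parameterized quantum circuit in a classical optimization loop: measure an energy, estimate its gradient, update the parameters, repeat. Here is a purely classical toy of that loop for a single qubit, minimizing ⟨ψ(θ)|Z|ψ(θ)⟩ = cos θ with the parameter-shift gradient rule; the function names and hyperparameters are illustrative, not from any particular framework.

```python
import math

def expectation_z(theta):
    """<psi(theta)|Z|psi(theta)> for |psi> = cos(theta/2)|0> + sin(theta/2)|1>.
    On real hardware this number would come from repeated circuit runs."""
    return math.cos(theta)

def vqe_minimise(steps=200, lr=0.1, theta=0.3):
    """Classical gradient descent on the one-parameter energy landscape."""
    for _ in range(steps):
        # Parameter-shift rule: dE/dtheta = [E(theta + pi/2) - E(theta - pi/2)] / 2,
        # which for E = cos(theta) gives exactly -sin(theta).
        grad = (expectation_z(theta + math.pi / 2)
                - expectation_z(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta, expectation_z(theta)

# The loop converges to theta ~ pi, where the energy reaches its minimum of -1.
theta_opt, energy = vqe_minimise()
```

The appeal of this hybrid scheme is that only the energy evaluations need a quantum device; the outer optimization stays on a conventional computer, which is why it is seen as a fit for today's small, noisy machines.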

The Large Hadron Collider, the world’s largest particle accelerator, collects about 300 gigabytes of data a second as it smashes protons together to try and unlock the fundamental secrets of the universe. To analyse it requires huge amounts of computing power – right now it’s split across 170 data centres in 42 countries. Some scientists at CERN – the European Organisation for Nuclear Research – hope quantum computers could help speed up the analysis of data by enabling them to run more accurate simulations before conducting real-world tests. They’re starting to develop algorithms and models that will help them harness the power of quantum computers when the devices get good enough to help.