
Princeton researchers have uncovered new rules governing how objects absorb and emit light, fine-tuning scientists’ control over light and boosting research into next-generation solar and optical devices.

The discovery solves a longstanding problem of scale, where light’s behavior when interacting with very small objects violates well-established physical constraints observed at larger scales.

“The kinds of effects you get for very small objects are different from the effects you get from very large objects,” said Sean Molesky, a postdoctoral researcher in electrical engineering and the study’s first author. The difference can be observed in moving from a molecule to a grain of sand. “You can’t simultaneously describe both things,” he said.

Engineering Food: The Impossible Whopper.

“Now, let’s compare the estrogen hormone in an impossible whopper to the whopper made from hormone implanted beef. The impossible whopper has 44 mg of estrogen and the whopper has 2.5 ng of estrogen. Now let me refresh your metric system. There are 1 million nanograms (ng) in one milligram (mg). That means an impossible whopper has 18 million times as much estrogen as a regular whopper. Just six glasses of soy milk per day has enough estrogen to grow boobs on a male. That’s the equivalent of eating four impossible whoppers per day. You would have to eat 880 pounds of beef from an implanted steer to equal the amount of estrogen in one birth control pill.”
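Taking the quoted figures at face value (and setting aside that the soy number refers to phytoestrogens rather than estrogen itself), the unit conversion in the quote can be checked directly:

```python
NG_PER_MG = 1_000_000  # 1 milligram = 1,000,000 nanograms

impossible_ng = 44 * NG_PER_MG  # 44 mg quoted for the impossible whopper
regular_ng = 2.5                # 2.5 ng quoted for the regular whopper

ratio = impossible_ng / regular_ng
print(ratio)  # 17600000.0 -- the quote rounds this to "18 million"
```

So the arithmetic in the quote is internally consistent with its own figures; whether those figures are comparable substances is a separate question.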


The impossible whopper is being advertised by Burger King as a plant-based alternative to the whopper. When food manufacturers started talking about making artificial meat, I, too, thought it would be impossible to make a hamburger cheaply enough to make it competitive. You see, I assumed that they would have to buy the individual amino acids (the building blocks for protein) and chemically string them together in the proper order, then remove the reagents (chemicals needed to cause the chain reactions) and then add something to give it the right texture.

The impossible whopper (made by Impossible Foods) bypassed all of those steps. Let’s compare the two. The impossible whopper patty is made from 24 ingredients. The most important ingredient is soy protein. The whopper patty has just one ingredient. That would be beef.

The impossible whopper has 630 calories, mostly from the added oils. The whopper has 660 calories. That is about 5 percent fewer calories, which is not a huge improvement.

Nowadays, there is a pressing need for novel computational concepts to manage the enormous data volumes produced by contemporary information technologies. The brain’s inherent ability to cope with such signals makes it the most efficient computational paradigm to mimic.

Representing neuronal processing with software-based artificial neural networks is a popular approach with tremendous impacts on everyday life; a field commonly known as machine learning or artificial intelligence. This approach relies on executing algorithms that represent neural networks on a traditional von Neumann computer architecture.
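As a minimal illustration of the software-based approach, each artificial neuron computes a weighted sum of its inputs followed by a nonlinearity. The sketch below (toy weights, no particular framework) shows a forward pass through a tiny two-layer network:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum plus bias, squashed by tanh."""
    return math.tanh(sum(w * x for w, x in zip(weights, inputs)) + bias)

def forward(x):
    # Two hidden neurons feeding one output neuron; the weights here
    # are arbitrary illustrative values, not trained parameters.
    h1 = neuron(x, [0.5, -0.3], 0.1)
    h2 = neuron(x, [-0.2, 0.8], -0.4)
    return neuron([h1, h2], [1.0, 1.0], 0.0)

out = forward([0.7, 0.2])
print(out)  # a value in (-1, 1)
```

In practice, frameworks execute millions of such weighted sums as matrix operations on von Neumann hardware, which is exactly the bottleneck the hardware-based approach below tries to avoid.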

An alternative approach is the direct emulation of the workings of the brain with actual electronic devices and circuits. This emulation of the brain at the hardware level is necessary not only for overcoming the scaling and efficiency limitations of conventional silicon technology based on the traditional von Neumann architecture, but also for understanding brain function through reverse engineering. This hardware-based approach constitutes the main scope of neuromorphic devices and computing.

Any comments?


Ultraprecise 3D printing technology is a key enabler for manufacturing precision biomedical and photonic devices. However, the existing printing technology is limited by its low efficiency and high cost. Professor Shih-Chi Chen and his team from the Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong (CUHK), collaborated with the Lawrence Livermore National Laboratory to develop the Femtosecond Projection Two-photon Lithography (FP-TPL) printing technology.

By controlling the spectrum via temporal focusing, the laser 3D printing process is performed in a parallel, layer-by-layer fashion instead of point-by-point writing. This new technique increases the printing speed by 1,000 to 10,000 times and reduces the cost by 98 percent. The achievement was recently published in Science, affirming it as a technological breakthrough that leads nanoscale 3D printing into a new era.

Conventional nanoscale 3D printing, i.e., two-photon polymerization (TPP), operates in a point-by-point scanning fashion. As such, even a centimeter-sized object can take several days to weeks to fabricate (build rate ~0.1 mm³/hour). The process is time-consuming and expensive, which prevents practical and industrial applications. To increase speed, the resolution of the finished product is often sacrificed. Professor Chen and his team have overcome this challenging problem by exploiting the concept of temporal focusing, in which a programmable femtosecond light sheet is formed at the focal plane for parallel nanowriting; this is equivalent to simultaneously projecting millions of laser foci at the focal plane, replacing the traditional method of focusing and scanning the laser at one point at a time. In other words, the FP-TPL technology can fabricate a whole plane in the time the point-scanning system takes to fabricate a point.
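A back-of-the-envelope calculation shows what the quoted figures imply. The ~0.1 mm³/hour build rate and the 1,000 to 10,000 times speedup come from the article; the 100 mm³ part volume below is an assumed example:

```python
# Hypothetical part volume; the build rate and speedup figures
# are taken from the article.
volume_mm3 = 100.0            # assumed part volume (not from the article)
tpp_rate_mm3_per_hr = 0.1     # conventional point-by-point TPP build rate
speedup = 1000                # lower bound of the quoted 1,000-10,000x gain

hours_tpp = volume_mm3 / tpp_rate_mm3_per_hr
hours_fp_tpl = hours_tpp / speedup

print(hours_tpp / 24)   # ~41.7 days point-by-point
print(hours_fp_tpl)     # 1.0 hour with the parallel approach
```

Even at the conservative end of the quoted speedup, a job measured in weeks drops to about an hour, which is what moves the technique from laboratory curiosity toward industrial use.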

Since the end of the 19th century, physicists have known that the transfer of energy from one body to another is associated with entropy. It quickly became clear that this quantity is of fundamental importance, and so began its triumphant rise as a useful theoretical quantity in physics, chemistry and engineering. However, it is often very difficult to measure. Professor Dietmar Block and Frank Wieben of Kiel University (CAU) have now succeeded in measuring entropy in complex plasmas, as they reported recently in the renowned scientific journal Physical Review Letters. In a system of charged microparticles within this ionized gas, the researchers were able to measure all positions and velocities of the particles simultaneously. In this way, they were able to determine the entropy, as it was already described theoretically by the physicist Ludwig Boltzmann around 1880.

Surprising thermodynamic equilibrium in plasma

“With our experiments, we were able to prove that in the important model system of complex plasmas, the thermodynamic fundamentals are fulfilled. What is surprising is that this applies to microparticles in a plasma, which is far away from thermodynamic equilibrium,” explains Ph.D. student Frank Wieben. In his experiments, he is able to adjust the thermal motion of the microparticles by means of a laser beam. Using video microscopy, he can observe the dynamic behaviour of the particles in real time and determine the entropy from the information collected.
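The kind of computation involved can be sketched as follows: given measured particle velocities, a Boltzmann/Gibbs-style entropy S = -k_B Σ p_i ln p_i can be estimated from a histogram of the velocity distribution. This is an illustrative estimator with a hypothetical bin width, not the paper’s exact procedure:

```python
import math
import random
from collections import Counter

def entropy_from_velocities(velocities, bin_width):
    """Estimate S = -k_B * sum(p * ln p) from sampled particle velocities
    by binning them into a histogram. Illustrative sketch only; the
    estimate carries an additive offset that depends on the bin width."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    bins = Counter(int(v // bin_width) for v in velocities)
    n = len(velocities)
    return -k_B * sum((c / n) * math.log(c / n) for c in bins.values())

random.seed(0)
# Hypothetical thermal (Gaussian) velocity sample in arbitrary units,
# standing in for the velocities extracted via video microscopy.
sample = [random.gauss(0.0, 1.0) for _ in range(10_000)]
print(entropy_from_velocities(sample, 0.25))
```

In the actual experiment, positions and velocities of all microparticles are measured simultaneously, so the distribution is obtained from data rather than a simulated sample as here.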

“I’m definitely not worried about the AI apocalypse; I just object to the hype and soundbites that some people are making,” said John Giannandrea, Vice President of Engineering with responsibility for Google’s Computer Science Research and Machine Intelligence groups, who leads teams in Machine Learning, Machine Intelligence, Computer Perception, Natural Language Understanding, and Quantum Computing, at the TechCrunch Disrupt conference in San Francisco.

Google’s John Giannandrea sits down with Frederic Lardinois to discuss the AI hype/worry cycle and the importance, limitations, and acceleration of machine learning.

The world’s first fully electric commercial aircraft took its inaugural test flight on Tuesday, taking off from the Canadian city of Vancouver and offering hope that airlines may one day end their polluting emissions.

“This proves that commercial aviation in all-electric form can work,” said Roei Ganzarski, chief executive of Seattle-based engineering firm magniX.

The company designed the plane’s motor and worked in partnership with Harbour Air, which ferries half a million passengers a year between Vancouver, Whistler ski resort and nearby islands and coastal communities.

After decades of miniaturization, the electronic components we’ve relied on for computers and modern technologies are now starting to reach fundamental limits. Faced with this challenge, engineers and scientists around the world are turning toward a radically new paradigm: quantum information technologies.

Quantum technology, which harnesses the strange rules that govern particles at the smallest scales, is normally thought of as much too delicate to coexist with the electronics we use every day in phones, laptops and cars. However, scientists with the University of Chicago’s Pritzker School of Molecular Engineering announced a significant breakthrough: Quantum states can be integrated and controlled in commonly used electronic devices made from silicon carbide.

“The ability to create and control high-performance quantum bits in commercial electronics was a surprise,” said lead investigator David Awschalom, the Liew Family Professor in Molecular Engineering at UChicago and a pioneer in quantum technology. “These discoveries have changed the way we think about developing quantum technologies—perhaps we can find a way to use today’s electronics to build quantum devices.”

The first ever integrated nanoscale device which can be programmed with either photons or electrons has been developed by scientists in Harish Bhaskaran’s Advanced Nanoscale Engineering research group at the University of Oxford.

In collaboration with researchers at the universities of Münster and Exeter, the scientists have created a first-of-its-kind electro-optical device which bridges the fields of optical and electronic computing. This provides an elegant solution for achieving faster and more energy-efficient memories and processors.

Computing at the speed of light has been an enticing but elusive prospect, but with this development it is now in tangible proximity. Using light to encode as well as transfer information enables these processes to occur at the ultimate speed limit, that of light. While using light for certain processes has recently been demonstrated experimentally, a compact device to interface with the electronic architecture of traditional computers has been lacking. The incompatibility of electrical and light-based computing fundamentally stems from the different interaction volumes in which electrons and photons operate. Electrical chips need to be small to operate efficiently, whereas photonic devices need to be large, as the wavelength of light is larger than that of electrons.
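The wavelength mismatch can be made concrete with a quick estimate: a telecom-band photon has a wavelength of about 1550 nm, while an electron with roughly 1 eV of kinetic energy (an illustrative choice) has a de Broglie wavelength of about a nanometre:

```python
import math

h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
eV = 1.602176634e-19    # joules per electron-volt

E = 1.0 * eV                 # assumed electron kinetic energy
p = math.sqrt(2 * m_e * E)   # non-relativistic momentum
lambda_electron = h / p      # de Broglie wavelength, metres

lambda_photon = 1550e-9      # telecom-band photon wavelength, metres
print(lambda_electron * 1e9)             # ~1.23 nm
print(lambda_photon / lambda_electron)   # photon over 1,000x larger
```

That three-orders-of-magnitude gap is the root of the mismatched interaction volumes: electronic features can shrink to nanometres, while photonic structures must stay comparable to the optical wavelength.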