
TAE Technologies, a California-based fusion energy technology company, has announced that its proprietary beam-driven field-reversed configuration (FRC) plasma generator has produced stable plasma at over 50 million degrees Celsius. The milestone has helped the company raise USD280 million in additional funding.

Norman, TAE’s USD150 million national laboratory-scale device (named after the company’s late founder, Norman Rostoker), was unveiled in May 2017 and reached first plasma in June of that year. The device achieved the latest milestone as part of a “well-choreographed sequence of campaigns” consisting of over 25,000 fully integrated fusion reactor core experiments. These experiments were optimised with the most advanced computing processes available, including machine learning from an ongoing collaboration with Google (which produced the Optometrist Algorithm) and processing power from the US Department of Energy’s INCITE programme, which leverages exascale-level computing.
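As background on the Optometrist Algorithm mentioned above: it pairs a machine that proposes new experiment settings with a human expert who judges which of two plasma shots performed better, much like an optometrist asking “better one, or two?”. Below is a minimal conceptual sketch of that loop in Python; the function names, the perturbation rule, and the callables run_shot and ask_expert are illustrative assumptions, not TAE’s or Google’s actual code.

```python
import random

def propose_candidate(reference, step=0.05):
    # Hypothetical proposal rule: nudge each machine setting by up to +/-5%.
    return {name: value * (1 + random.uniform(-step, step))
            for name, value in reference.items()}

def optometrist_loop(initial_settings, run_shot, ask_expert, n_shots=100):
    """Alternate machine-proposed settings with an expert's pairwise choice.

    run_shot(settings)    -> result of one plasma experiment (assumed callable)
    ask_expert(ref, cand) -> "reference" or "candidate", the expert's preference
    """
    reference = initial_settings
    reference_result = run_shot(reference)
    for _ in range(n_shots):
        candidate = propose_candidate(reference)
        candidate_result = run_shot(candidate)
        # The expert compares the two shots and keeps the better one, steering
        # the search without an explicit, hand-written objective function.
        if ask_expert(reference_result, candidate_result) == "candidate":
            reference, reference_result = candidate, candidate_result
    return reference
```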

Plasma must be hot enough to enable sufficiently forceful collisions to cause fusion, and it must be sustained long enough to harness the power at will. These requirements are known as the ‘hot enough’ and ‘long enough’ milestones. TAE said it had proved the ‘long enough’ component in 2015, after more than 100,000 experiments. A year later, the company began building Norman, its fifth-generation device, to further test plasma temperature increases in pursuit of ‘hot enough’.
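For context (this is standard fusion background rather than part of TAE’s announcement), the ‘hot enough’ and ‘long enough’ requirements are commonly combined into the Lawson triple-product criterion, which for a deuterium-tritium plasma is roughly

$$ n \, T \, \tau_E \gtrsim 3 \times 10^{21}\ \mathrm{keV\,s\,m^{-3}}, $$

where $n$ is the plasma density, $T$ its temperature, and $\tau_E$ the energy confinement time. A reactor has to clear the product of all three at once, which is why temperature and confinement time are pursued as separate milestones.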

Elon Musk finally got to show off his monkey.

Neuralink, a company founded by Musk that is developing artificial-intelligence-powered microchips to go in people’s brains, released a video Thursday appearing to show a macaque using the tech to play video games, including “Pong.”

Musk has boasted about Neuralink’s tests on primates before, but this is the first time the company has put one on display. During a presentation in 2019, Musk said the company had enabled a monkey to “control a computer with its brain.”

NGAD is the Navy’s effort to replace the Super Hornet. Note: it’s a completely separate program from the Air Force’s own NGAD (which recently designed, tested, and flew a secret new fighter jet) and will produce a completely separate plane. The two aircraft will almost certainly differ, with the Air Force’s jet optimized more for air superiority, though the two fighters, developed over roughly the same period, will likely share much of the same technology.


The U.S. Navy elaborated on its plans to replace the F/A-18E/F Super Hornet, saying the service’s next strike fighter will “most likely be manned.” The jet will probably fly alongside robotic allies, and remotely crewed aircraft could eventually account for six out of 10 planes on a carrier flight deck.

“As we look at it right now, the Next-Gen Air Dominance [NGAD] is a family of systems, which has as its centerpiece the F/A-XX—which may or may not be manned—platform. It’s the fixed-wing portion of the Next-Gen Air Dominance family of systems,” said Rear Adm. Gregory Harris, the head of the Chief of Naval Operations’ air warfare directorate, during a Navy League event.

The F/A-18E/F Super Hornet dominates the Navy’s strike fighter fleet, which is made up of aircraft that can execute both fighter and attack missions. Although the Navy is buying the F-35C Joint Strike Fighter, it’s only purchasing enough of the planes to replace one or two of the four strike fighter squadrons per deployed aircraft carrier. The Navy believes it needs to replace the Super Hornet and its electronic warfare variant, the EA-18G Growler, in the 2030s.

Rice University computer scientists have demonstrated artificial intelligence (AI) software that runs on commodity processors and trains deep neural networks 15 times faster than platforms based on graphics processors.

“The cost of training is the actual bottleneck in AI,” said Anshumali Shrivastava, an assistant professor of computer science at Rice’s Brown School of Engineering. “Companies are spending millions of dollars a week just to train and fine-tune their AI workloads.”

Shrivastava and collaborators from Rice and Intel will present research that addresses that bottleneck April 8 at the machine learning systems conference MLSys.

Three years of underground robotics competitions culminate in a final event in September with $5 million in prize money.


The DARPA Subterranean Challenge Final Event is scheduled to take place at the Louisville Mega Cavern in Louisville, Kentucky, from September 21 to 23. We’ve followed SubT teams as they’ve explored their way through abandoned mines, unfinished nuclear reactors, and a variety of caves, and now everything comes together in one final course where the winner of the Systems Track will take home the $2 million first prize.

It’s a fitting reward for teams that have been solving some of the hardest problems in robotics, but winning isn’t going to be easy, and we’ll talk with SubT Program Manager Tim Chung about what we have to look forward to.

Since we haven’t talked about SubT in a little while (what with the unfortunate COVID-related cancellation of the Systems Track Cave Circuit), here’s a quick refresher on where we are: the teams have made it through the Tunnel Circuit, the Urban Circuit, and a virtual version of the Cave Circuit, and some of them have been testing in caves of their own. The Final Event will include all of these environments, and the teams of robots will have 60 minutes to autonomously map the course, locating artifacts to score points. Since I’m not sure where on Earth there’s an underground location that combines tunnels and caves with urban structures, DARPA is going to have to get creative, and the location in which they’ve chosen to do that is Louisville, Kentucky.

Like something straight out of a pulpy sci-fi horror flick, Xenobots, a new generation of living robots engineered by researchers at Tufts University and the University of Vermont (UVM), demonstrate cooperative swarm activity while collecting piles of microparticles.

Last year, this same team of scientists and biologists created tiny self-healing bio-machines that exhibited movement, payload-pushing abilities, and a sort of hive mentality. These biological bots technically aren’t a typical robot or a catalogued animal species; instead, they are more akin to a distinct class of unique artifact that acts as a living, programmable organism.