
An artificial neural network (ANN) designed by an international team involving UCL can translate raw data from brain activity, paving the way for new discoveries and a closer integration between technology and the brain.

The new method could accelerate discoveries of how brain activity relates to behavior.

The study, published today in eLife, was co-led by the Kavli Institute for Systems Neuroscience in Trondheim and the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, and funded by Wellcome and the European Research Council. It shows that a specific type of deep learning algorithm is able to decode many different behaviors and stimuli from a wide variety of brain regions in different species, including humans.
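To illustrate the decoding idea in its simplest form (this is not the study's actual architecture, which is described in the paper), the sketch below fits a minimal logistic-regression decoder to simulated neural activity; every size and effect strength here is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "raw brain activity": firing rates of 50 neurons on 200 trials,
# where a binary behavior shifts the mean activity of the first 10 neurons.
n_trials, n_neurons = 200, 50
behavior = rng.integers(0, 2, n_trials)          # 0/1 behavioral label
X = rng.normal(size=(n_trials, n_neurons))
X[:, :10] += 1.5 * behavior[:, None]             # behavior-coupled neurons

# Minimal decoder: logistic regression fit by batch gradient descent.
w, b = np.zeros(n_neurons), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))           # predicted P(behavior = 1)
    w -= 0.5 * (X.T @ (p - behavior)) / n_trials
    b -= 0.5 * np.mean(p - behavior)

p = 1 / (1 + np.exp(-(X @ w + b)))
accuracy = np.mean((p > 0.5) == behavior)
print(f"decoding accuracy: {accuracy:.2f}")
```

Deep-learning decoders replace the linear map with a multi-layer network, but the pipeline (activity in, behavioral prediction out) is the same.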

Summary: Study reveals how the brain analyzes different types of speech which may be linked to how we comprehend sentences and calculate mathematical equations.

Source: SfN

Separate math and language networks segregate naturally when listeners pay attention to one type over the other, according to research recently published in the Journal of Neuroscience.

3D-printed rockets save on up-front tooling, enable rapid iteration, decrease part count, and facilitate radically new designs. For your chance to win 2 seats on one of the first Virgin Galactic flights to space and support a great cause, go to https://www.omaze.com/veritasium.

Thanks to Tim Ellis and everyone at Relativity Space for the tour!
https://www.relativityspace.com/
https://youtube.com/c/RelativitySpace.

Special thanks to Scott Manley for the interview and advising on aerospace engineering.
Check out his channel: https://www.youtube.com/user/szyzyg.

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
References:
Benson, T. (2021). Rocket Parts. NASA. — https://ve42.co/RocketParts.

Boen, B. (2009). Winter Wonder: Rocket Icicles. NASA. — https://ve42.co/EngineIcicles.

Hall, N. (2021). Rocket Thrust Equation. NASA. — https://ve42.co/RocketEqn.

Benson, T. (2021). Rocket Thrust. NASA. — https://ve42.co/RocketThrust.
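One of the items above, Hall (2021), covers NASA's rocket thrust equation; for quick reference, its standard form is:

```latex
F = \dot{m}\, v_e + \left(p_e - p_0\right) A_e
```

where \(\dot{m}\) is the propellant mass flow rate, \(v_e\) the exhaust exit velocity, \(p_e\) the nozzle exit pressure, \(p_0\) the ambient pressure, and \(A_e\) the nozzle exit area. The pressure term vanishes when the nozzle is perfectly expanded (\(p_e = p_0\)).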



University of Advancing Technology’s Artificial Intelligence (AI) degree explores the theory and practice of engineering software systems that simulate thinking, patterning, and advanced decision-making behaviors. Drawing inspiration from fields ranging from biology to design, UAT’s Artificial Intelligence program teaches students to build software systems that solve complex problems. Students will work with technologies including voice recognition, simulation agents, machine learning (ML), and the Internet of Things (IoT).

Students pursuing this specialized computer programming degree develop applications using evolutionary and genetic algorithms, cellular automata, artificial neural networks, agent-based models, and other artificial intelligence methodologies. UAT’s degree in AI covers the fundamentals of general and applied artificial intelligence including core programming languages and platforms used in computer science.

New algorithm could enable fast, nimble drones for time-critical operations such as search and rescue.

If you follow autonomous drone racing, you likely remember the crashes as much as the wins. In drone racing, teams compete to see which vehicle is better trained to fly fastest through an obstacle course. But the faster drones fly, the more unstable they become, and at high speeds their aerodynamics can be too complicated to predict. Crashes, therefore, are a common and often spectacular occurrence.

But if they can be pushed to be faster and more nimble, drones could be put to use in time-critical operations beyond the race course, for instance to search for survivors in a natural disaster.

As reported in a new article in Nature Reviews Physics, instead of waiting for fully mature quantum computers to emerge, Los Alamos National Laboratory and other leading institutions have developed hybrid classical/quantum algorithms to extract the most performance, and potentially quantum advantage, from today’s noisy, error-prone hardware. Known as variational quantum algorithms, they use the quantum processors to manipulate quantum systems while shifting much of the workload to classical computers, letting them do what they currently do best: solve optimization problems.

“Quantum computers have the promise to outperform classical computers for certain tasks, but on currently available quantum hardware they can’t run long algorithms. They have too much noise as they interact with the environment, which corrupts the information being processed,” said Marco Cerezo, a physicist specializing in quantum machine learning and quantum information at Los Alamos and a lead author of the paper. “With variational quantum algorithms, we get the best of both worlds. We can harness the power of quantum computers for tasks that classical computers can’t do easily, then use classical computers to complement the computational power of quantum devices.”

Current noisy, intermediate-scale quantum computers have between 50 and 100 qubits, lose their “quantumness” quickly, and lack error correction, which would require many more qubits. Since the late 1990s, however, theoreticians have been developing algorithms designed to run on an idealized large, error-corrected, fault-tolerant quantum computer.
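To make the hybrid division of labor concrete, here is a minimal variational loop for a toy single-qubit problem. The "quantum" step (here simulated classically with NumPy) evaluates an expectation value for a parameterized state, and a classical gradient loop tunes the parameter. The Hamiltonian and all numbers are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy variational loop: the "quantum" subroutine prepares
# |psi(theta)> = Ry(theta)|0> and measures <psi|Z|psi>; a classical
# optimizer adjusts theta. The ground-state energy of H = Z is -1.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def energy(theta):
    # Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ Z @ psi                  # expectation <Z> = cos(theta)

# Classical outer loop: gradient descent using the parameter-shift rule,
# which yields exact gradients for this family of gates.
theta = 0.1
for _ in range(100):
    grad = (energy(theta + np.pi / 2) - energy(theta - np.pi / 2)) / 2
    theta -= 0.3 * grad

print(f"optimized energy: {energy(theta):.4f}")
```

On real hardware, `energy()` would be estimated from repeated measurements on the noisy device, while the parameter update stays on the classical side, which is exactly the split the article describes.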

In this work, we introduce a classical variational method for simulating QAOA, a hybrid quantum-classical approach for solving combinatorial optimization problems with prospects of quantum speedup on near-term devices. We employ a self-contained approximate simulator based on neural-network quantum state (NQS) methods borrowed from many-body quantum physics, departing from the traditional exact simulations of this class of quantum circuits.

We successfully explore previously unreachable regions of the QAOA parameter space, owing to the good performance of our method near optimal QAOA angles. The model's limitations are discussed in terms of the lower fidelity of the reproduced quantum state away from this optimum. Because of this different domain of applicability and its relatively low computational cost, the method is introduced as complementary to established numerical methods for the classical simulation of quantum circuits.

Classical variational simulations of quantum algorithms provide a natural way to both benchmark and understand the limitations of near-future quantum hardware. On the algorithmic side, our approach can help answer a fundamentally open question in the field, namely whether QAOA can outperform classical optimization algorithms or quantum-inspired classical algorithms based on artificial neural networks [48, 49, 50].
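For readers unfamiliar with QAOA itself, the following sketch simulates a depth-one QAOA circuit for MaxCut on a triangle graph by exact brute-force statevector evolution (the traditional exact approach that the NQS method departs from, not the authors' method; the graph and the grid search over angles are illustrative assumptions).

```python
import numpy as np
from itertools import product

# Depth-1 QAOA for MaxCut on a triangle graph, simulated exactly.
edges = [(0, 1), (1, 2), (0, 2)]
n = 3

def cut_value(bits):
    return sum(bits[i] != bits[j] for i, j in edges)

# Cost C(z) of every basis state |z>, z in {0,1}^n.
costs = np.array([cut_value(bits) for bits in product([0, 1], repeat=n)])

def qaoa_expectation(gamma, beta):
    # Start in the uniform superposition |+>^n.
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
    # Cost layer: diagonal phase e^{-i * gamma * C(z)}.
    psi *= np.exp(-1j * gamma * costs)
    # Mixer layer: Rx(2*beta) = e^{-i * beta * X} on each qubit.
    for q in range(n):
        psi = np.moveaxis(psi.reshape([2] * n), q, 0)
        a, b = psi[0].copy(), psi[1].copy()
        psi[0] = np.cos(beta) * a - 1j * np.sin(beta) * b
        psi[1] = np.cos(beta) * b - 1j * np.sin(beta) * a
        psi = np.moveaxis(psi, 0, q).reshape(-1)
    # Expected cut value under measurement.
    return float(np.real(np.sum(np.abs(psi) ** 2 * costs)))

# Classical outer loop: here, a plain grid search over the two QAOA angles.
grid = np.linspace(0, np.pi, 40)
best = max((qaoa_expectation(g, b), g, b) for g in grid for b in grid)
print(f"best expected cut: {best[0]:.3f} (max cut of the triangle is 2)")
```

At three qubits the full statevector has only 8 amplitudes; the exponential growth of this vector with qubit count is precisely why approximate simulators such as the NQS ansatz become necessary at larger sizes.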
