
Researchers at UC San Francisco have successfully developed a “speech neuroprosthesis” that has enabled a man with severe paralysis to communicate in sentences, translating signals from his brain to the vocal tract directly into words that appear as text on a screen.

The achievement, which was developed in collaboration with the first participant of a clinical research trial, builds on more than a decade of effort by UCSF neurosurgeon Edward Chang, MD, to develop a technology that allows people with paralysis to communicate even if they are unable to speak on their own. The study appears July 15 in the New England Journal of Medicine.
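
The paper itself doesn't include code, but the core idea, mapping short windows of cortical activity to word probabilities and stringing the most likely candidates into sentences, can be sketched in a few lines of Python. Everything below (the channel count, the toy vocabulary, the random linear decoder) is an illustrative assumption; the real system paired trained neural-network decoders with a language model.

```python
# Hypothetical sketch: decoding words from cortical activity.
# Shapes, vocabulary, and weights are illustrative, not from the study.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["hello", "i", "am", "thirsty", "yes", "no"]   # toy vocabulary
N_ELECTRODES = 128                                     # assumed channel count
WINDOW = 20                                            # time bins per word attempt

# Stand-in for a trained decoder: a linear map from flattened neural
# features to vocabulary logits (the real system used neural networks
# plus a language model to clean up the word sequence).
W = rng.normal(size=(len(VOCAB), N_ELECTRODES * WINDOW))

def decode_word(neural_window: np.ndarray) -> str:
    """Map one window of recorded features to the most likely word."""
    logits = W @ neural_window.ravel()
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return VOCAB[int(np.argmax(probs))]

# Simulate a short utterance: three windows of recorded activity.
sentence = [decode_word(rng.normal(size=(N_ELECTRODES, WINDOW)))
            for _ in range(3)]
print(" ".join(sentence))
```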

Once studied by Charles Darwin, the Venus flytrap is perhaps the most famous plant that moves at high speed. But as Daniel Rayneau-Kirkhope explains, researchers are still unearthing new scientific insights into plant motion, which could lead to novel, bio-inspired robotic structures.

“In the absence of any other proof,” Isaac Newton is once said to have proclaimed, “the thumb alone would convince me of God’s existence.” With 29 bones, 123 ligaments and 34 muscles pulling the strings, the human hand is indeed a feat of nature’s engineering. It lets us write, touch, hold, feel and interact in exquisite detail with the world around us.

To replicate the wonders of the human hand, researchers in the field of “soft robotics” are trying to design artificial structures made from flexible, compliant materials that can be controlled and programmed by computers. Trouble is, the hand is such a complex structure that it needs lots of computing power to be properly controlled. That’s a problem when developing prosthetic hands for people who have lost an arm in, say, an accident or surgery.

Humans are integrating with technology. Not in the future – now. With the emergence of custom prosthetics that make us stronger and faster, neural implants that change how our brains work, and new senses and abilities that you’ve never dreamed of having, it’s time to start imagining what a better version of you might look like.

From reality-enhancing implants to brain-controlled exoskeletons, breakthroughs in bio-tech have fuelled a new fusion of machinery and organic matter.

It’s an astonishing achievement — and in an eyebrow-raising twist, Simons says he plans to live forever, by turning himself into a cyborg.

It sounds like Simons has thought out his plan.

“This is the first puzzle piece in my goal of replacing body parts with mechanical parts,” Simons told De Telegraaf, adding that his goal is “immortality.”

Over the past few decades, roboticists and computer scientists have developed artificial systems that replicate biological functions and human abilities in increasingly realistic ways. This includes artificial intelligence systems, as well as sensors that can capture various types of sensory data.

When trying to understand properties of objects and how to grasp them or handle them, humans often rely on their sense of touch. Artificial sensing systems that replicate human touch can thus be of great value, as they could enable the development of better performing and more responsive robots or prosthetic limbs.

Researchers at Sungkyunkwan University and Hanyang University in South Korea have recently created an artificial tactile sensing system that mimics the way in which humans recognize objects in their surroundings via their sense of touch. This system, presented in a paper published in Nature Electronics, uses sensors to capture data associated with the tactile properties of objects.
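
As a rough illustration of the recognition step such a system performs, here is a minimal sketch, assuming a small array of pressure-sensing elements and an off-the-shelf nearest-neighbor classifier; the sensor layout, simulated signals, and model are placeholders rather than the hardware or learning system described in the paper.

```python
# Hypothetical sketch of tactile object recognition: classify objects
# from simulated pressure-sensor readings. The sensor count, signal
# model, and classifier are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
N_TAXELS = 16  # assumed number of sensing elements in the array

def simulate_grasp(stiffness: float, texture: float) -> np.ndarray:
    """Fake a pressure profile whose statistics depend on the object."""
    base = stiffness * np.linspace(0.2, 1.0, N_TAXELS)
    noise = texture * rng.normal(size=N_TAXELS)
    return base + noise

# Two toy object classes: a hard, smooth object and a soft, rough one.
X = np.array([simulate_grasp(1.0, 0.05) for _ in range(50)] +
             [simulate_grasp(0.3, 0.30) for _ in range(50)])
y = np.array([0] * 50 + [1] * 50)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([simulate_grasp(0.95, 0.05)]))  # expect class 0
```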

Imagine clothing that can warm or cool you, depending on how you’re feeling. Or artificial skin that responds to touch, temperature, and wicks away moisture automatically. Or cyborg hands controlled with DNA motors that can adjust based on signals from the outside world.

Welcome to the era of intelligent matter: an unconventional AI computing idea woven directly into the fabric of synthetic matter. Powered by brain-based computing, these materials can form the skins of soft robots or assemble into microswarms of drug-delivering nanobots, all while conserving power as they learn and adapt.

Sound like sci-fi? It gets weirder. The key to intelligent matter, said Dr. W.H.P. Pernice at the University of Münster and colleagues, is a distributed “brain” across the material’s “body”, an architecture far more alien than the structure of our own minds.

In the very last moments of the movie, however, you would also see something unusual: the sprouting of clouds of satellites, and the wrapping of the land and seas with wires made of metal and glass. You would see the sudden appearance of an intricate artificial planetary crust capable of tremendous feats of communication and calculation, enabling planetary self-awareness — indeed, planetary sapience.

The emergence of planetary-scale computation thus appears as both a geological and geophilosophical fact. In addition to evolving countless animal, vegetal and microbial species, Earth has also very recently evolved a smart exoskeleton, a distributed sensory organ and cognitive layer capable of calculating things like: How old is the planet? Is the planet getting warmer? The knowledge of “climate change” is an epistemological accomplishment of planetary-scale computation.

Over the past few centuries, humans have chaotically and in many cases accidentally transformed Earth’s ecosystems. Now, in response, the emergent intelligence represented by planetary-scale computation makes it possible, and indeed necessary, to conceive an intentional, directed and worthwhile planetary-scale terraforming. The vision for this is not to be found in computing infrastructure itself, but in the purposes to which we put it.

Stimulation of the nervous system with neurotechnology has opened up new avenues for treating human disorders: prosthetic arms and legs that restore the sense of touch in amputees, prosthetic fingertips that provide detailed sensory feedback with varying touch resolution, and intraneural stimulation that gives blind people sensations of sight.

Scientists in a European collaboration have shown that optic nerve stimulation is a promising neurotechnology to help the blind, with the caveat that current technology can provide only simple visual signals.

Nevertheless, the scientists’ vision (no pun intended) is to design these simple visual signals to be meaningful in assisting the blind with daily living. Optic nerve stimulation also avoids invasive procedures like directly stimulating the brain’s visual cortex. But how does one go about optimizing stimulation of the optic nerve to produce consistent and meaningful visual sensations?

Now, the results of a collaboration between EPFL, Scuola Superiore Sant’Anna and Scuola Internazionale Superiore di Studi Avanzati, published today in Patterns, show that a new optic nerve stimulation protocol is a promising way to develop personalized visual signals for the blind, one that also takes into account signals from the visual cortex. So far, the protocol has been tested on convolutional neural networks (CNNs), artificial neural networks that simulate the entire visual system and are commonly used in computer vision for detecting and classifying objects. The scientists also performed psychophysical tests on ten healthy subjects, imitating what one would see from optic nerve stimulation, and showed that successful object identification is consistent with the results obtained from the CNN.
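
To make the optimization step concrete, here is a minimal sketch of the general approach: treat the simulated visual system as a fixed forward model, then adjust electrode currents by gradient descent until the model's predicted percept approaches a target pattern. The linear surrogate, the dimensions, and the current limits below are illustrative assumptions standing in for the authors' CNN-based pipeline.

```python
# Hypothetical sketch of stimulation-protocol optimization: choose
# electrode currents so a surrogate model of the visual system produces
# a percept close to a target pattern. The linear surrogate and the
# gradient-descent loop are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(2)
N_ELECTRODES, N_PIXELS = 32, 64

M = rng.normal(size=(N_PIXELS, N_ELECTRODES))  # surrogate: currents -> percept
target = rng.uniform(size=N_PIXELS)            # desired visual pattern

s = np.zeros(N_ELECTRODES)                     # stimulation currents to optimize
lr = 1e-3
for step in range(2000):
    residual = M @ s - target                  # percept error under the model
    s -= lr * (2 * M.T @ residual)             # gradient of the squared error
    s = np.clip(s, 0.0, 1.0)                   # assumed safe current limits

print(f"final percept error: {np.linalg.norm(M @ s - target):.3f}")
```

In the study itself the surrogate is a convolutional network rather than a linear map, but the logic is the same: any differentiable model of the visual pathway lets the stimulation parameters be tuned automatically rather than by trial and error.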

“We are not just trying to stimulate the optic nerve to elicit a visual perception,” explains Simone Romeni, EPFL scientist and first author of the study. “We are developing a way to optimize stimulation protocols that takes into account how the entire visual system responds to optic nerve stimulation.”

“The research shows that you can optimize optic nerve stimulation using machine learning approaches. It shows more generally the full potential of machine learning to optimize stimulation protocols for neuroprosthetic devices,” continues Silvestro Micera, EPFL Bertarelli Foundation Chair in Translational Neural Engineering and Professor of Bioelectronics at the Scuola Superiore Sant’Anna.

Hugh Herr is building the next generation of bionic limbs, robotic prosthetics inspired by nature’s own designs. Herr lost both legs in a climbing accident 30 years ago; now, as the head of the MIT Media Lab’s Biomechatronics group, he shows his incredible technology with the help of ballroom dancer Adrianne Haslet-Davis, who lost her left leg in the 2013 Boston Marathon bombing.