
I’m excited to share my new one-hour interview on Singularity University radio with Steven Parton. Also, check out Singularity Hub and the write-up they did of the interview. We talk all things transhumanism, longevity, cyborgs, and the future:



Our bodies do a decent job of repairing themselves: they can patch up wounds, fight off infections, and even heal broken bones. But that only holds up to a point. Lose a limb, for example, and it isn’t coming back short of a prosthesis. Other creatures have mastered this skill, though, and now scientists at the University of California, Davis (UC Davis) and Harvard have sequenced the RNA transcripts of the immortal hydra and worked out how it manages such feats of regeneration.

These eyes aren’t human, though. They don’t blink or take breaks, and, guided by artificial intelligence, they can tell the difference between a dust cloud, an insect swarm, and a plume of smoke that demands quick attention. In Brazil, the devices help keep mining giant Vale SA working and protect trees for pulp and paper producer Suzano SA.

About 17 years ago, Keven Walgamott lost his left hand and part of his forearm in an electrical accident. Now, Walgamott can use his thoughts to tell the fingers of his bionic hand to pick up eggs and grapes. The prosthetic arm he tested also allowed Walgamott to feel the objects he grasped.

A biomedical engineering team at the University of Utah created the “LUKE Arm,” named in honor of the robotic hand Luke Skywalker obtains in “Star Wars: The Empire Strikes Back” after Darth Vader slices off his hand with a lightsaber.

A new study published Wednesday in the journal Science Robotics explained how the arm revived the sensation of touch for Walgamott. The University of Chicago and the Cleveland Clinic were also involved in the study.

Imagine a patient controlling the movement of a prosthetic limb simply by thinking of commands. It may sound like science fiction, but it will soon become reality thanks to the EU-funded DeTOP project. A consortium of engineers, neuroscientists, and clinicians has made great strides in further developing the technology behind more natural and functional prostheses.

The team uses an osseointegrated human-machine gateway (OHMG) to create a physical link between a person and a robotic prosthesis. A patient in Sweden was the first recipient of titanium implants with the OHMG system. The OHMG is fitted directly to the bones of the user’s arm; electrodes connected to nerves and muscles extract signals to control a robotic hand and provide tactile sensations. According to a news item by “News Medical,” the patient will begin using a training prosthesis in the next few months before being fitted with the new artificial hand developed by DeTOP partners. This will help the team evaluate the entire system, including the implanted interface, the electronics, and wrist and hand function. Motor coordination and grip strength will also be assessed during the tests.
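The write-up doesn’t describe the decoding pipeline itself, but the general idea, electrodes on muscles and nerves yielding signals that drive a hand, can be sketched. Below is a minimal, hypothetical Python illustration: the RMS-envelope feature, the two channel names, and the threshold are all assumptions made for the sketch, not DeTOP’s actual algorithms.

```python
# Minimal sketch of decoding muscle signals into hand commands, assuming
# (hypothetically) that each implanted electrode yields a sampled voltage
# stream. Names, thresholds, and the feature choice are illustrative only.
import numpy as np

def rms_envelope(signal: np.ndarray, window: int = 50) -> np.ndarray:
    """Root-mean-square envelope: a common proxy for muscle activation."""
    squared = np.convolve(signal ** 2, np.ones(window) / window, mode="same")
    return np.sqrt(squared)

def decode_grip(flexor: np.ndarray, extensor: np.ndarray,
                threshold: float = 0.3) -> str:
    """Map whichever channel is more active to a hand command."""
    f = rms_envelope(flexor)[-1]
    e = rms_envelope(extensor)[-1]
    if max(f, e) < threshold:
        return "hold"            # neither muscle active enough: do nothing
    return "close" if f > e else "open"

# Synthetic test: an active "flexor" channel and a quiet "extensor" channel.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
flexor = 0.8 * np.abs(np.sin(8 * t)) + 0.05 * rng.standard_normal(1000)
extensor = 0.05 * rng.standard_normal(1000)
print(decode_grip(flexor, extensor))  # -> "close"
```

A real system would add filtering, per-user calibration, and proportional rather than discrete control, but the signal-to-command mapping above is the core loop.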

Research on robotic prostheses is coming along in leaps and bounds, but one hurdle is proving hard to overcome: a sense of touch. Among other things, this sense helps us regulate our grip strength, which is vital for the fine motor control needed to handle delicate objects.

Enter a new upgrade for the LUKE Arm — named for Luke Skywalker, the Star Wars hero with a robotic hand. Prototype versions of this robotic prosthesis can be linked up to the wearer’s nerves.

And, thanks to biomedical engineers at the University of Utah, for the participants of their experimental study the arm can now also convey a sense of touch. This spectacular advance allowed one wearer to handle grapes, peel a banana, and even feel his wife’s hand in his.
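The article doesn’t disclose the LUKE Arm’s control laws, but the reason touch matters for grip can be shown with a toy closed-loop controller. Everything below, the regulate_grip function, the gain, and the one-line stand-in for the object pushing back, is a hypothetical sketch, not the device’s firmware.

```python
# Toy closed-loop grip control: sensed contact force feeds back into grip
# effort, so the hand settles at a gentle target force instead of crushing
# the object. All values are invented for illustration.
def regulate_grip(target_force: float, sensed_force: float,
                  grip: float, gain: float = 0.1) -> float:
    """Proportional control: nudge grip effort toward the target force."""
    error = target_force - sensed_force
    return max(0.0, grip + gain * error)

grip, sensed = 0.0, 0.0
for step in range(15):
    grip = regulate_grip(target_force=1.0, sensed_force=sensed, grip=grip)
    sensed = 2.0 * grip  # stand-in for the object pushing back as we squeeze
    print(f"step {step:2d}: grip={grip:.3f} force={sensed:.3f}")
```

Run this open loop (ignoring sensed_force) and grip effort rises unchecked; with the feedback it converges to the target force, which is exactly the behavior handling a grape demands.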

Researchers from RMIT University have drawn inspiration from optogenetics, an emerging tool in biotechnology, to develop a device that replicates the way the brain stores and loses information. Optogenetics allows scientists to delve into the body’s electrical system with incredible precision, using light to manipulate neurons so that they can be turned on or off.

The new device is based on an ultra-thin material that changes electrical resistance in response to different wavelengths of light, enabling it to mimic the way neurons store and delete information in the brain. Research team leader Dr. Sumeet Walia said the technology has applications in artificial intelligence (AI) systems that can harness the brain’s full sophisticated functionality.

“Our optogenetically-inspired chip imitates the fundamental biology of nature’s best computer—the human brain,” Walia said. “Being able to store, delete and process information is critical for computing, and the brain does this extremely efficiently. We’re able to simulate the brain’s neural approach simply by shining different colors onto our chip. This technology takes us further on the path towards fast, efficient and secure light-based computing. It also brings us an important step closer to the realization of a bionic brain—a brain-on-a-chip that can learn from its environment just like humans do.”
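As a rough mental model of such a light-gated memory cell, consider the toy simulation below. The wavelength bands and the update rule are assumptions made for the sketch; the RMIT team’s actual device physics and thresholds aren’t given in this write-up.

```python
# Toy model of a light-gated synaptic cell: one wavelength band strengthens
# the stored state (potentiation), another weakens it (depression). Bands
# and step sizes are illustrative assumptions, not measured device behavior.
class OptoSynapse:
    def __init__(self, weight: float = 0.5):
        self.weight = weight  # normalized conductance, 0..1

    def expose(self, wavelength_nm: float, dose: float = 0.1) -> float:
        if wavelength_nm < 500:  # assume shorter wavelengths "write"
            self.weight = min(1.0, self.weight + dose)
        else:                    # assume longer wavelengths "erase"
            self.weight = max(0.0, self.weight - dose)
        return self.weight

cell = OptoSynapse()
for wl in (450, 450, 450, 650):  # write three times, then partially erase
    print(f"{wl} nm -> weight {cell.expose(wl):.2f}")
```

The point of the analogy is that a single element both stores a value and updates it in place under optical control, which is what lets such hardware mimic synaptic learning.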

Auditory stimulus reconstruction is a technique that finds the best approximation of the acoustic stimulus from the population of evoked neural activity. Reconstructing speech from the human auditory cortex creates the possibility of a speech neuroprosthesis that establishes direct communication with the brain, and this has been shown to be possible in both overt and covert conditions. However, the low quality of the reconstructed speech has severely limited the utility of this method for brain-computer interface (BCI) applications. To advance the state of the art in speech neuroprosthesis, we combined recent advances in deep learning with the latest innovations in speech synthesis technologies to reconstruct closed-set intelligible speech from the human auditory cortex. We investigated how reconstruction accuracy depends on the regression method, linear versus nonlinear (deep neural network), and on the acoustic representation used as the target of reconstruction, including the auditory spectrogram and speech synthesis parameters. In addition, we compared reconstruction accuracy from low and high neural frequency ranges. Our results show that a deep neural network model that directly estimates the parameters of a speech synthesizer from all neural frequencies achieves the highest subjective and objective scores on a digit recognition task, improving intelligibility by 65% over the baseline method, which used linear regression to reconstruct the auditory spectrogram. These results demonstrate the efficacy of deep learning and speech synthesis algorithms for designing the next generation of speech BCI systems, which can not only restore communication for paralyzed patients but also have the potential to transform human-computer interaction technologies.
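As a minimal sketch of the abstract’s core comparison, linear versus nonlinear regression from neural features to acoustic targets, the toy below uses synthetic data and scikit-learn stand-ins (LinearRegression and a small MLPRegressor); the authors’ actual recordings, features, and vocoder are not reproduced here.

```python
# Toy comparison: linear vs. nonlinear regression from simulated "neural
# features" to "acoustic parameters". Data and models are stand-ins; the
# study used invasive auditory-cortex recordings and a full speech vocoder.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 64))   # per-frame neural features (synthetic)
W = rng.standard_normal((64, 16))
Y = np.tanh(X @ W)                   # nonlinear map to acoustic parameters

linear = LinearRegression().fit(X[:400], Y[:400])
mlp = MLPRegressor(hidden_layer_sizes=(128,), max_iter=2000,
                   random_state=0).fit(X[:400], Y[:400])

def test_mse(model) -> float:
    """Mean squared error on the held-out frames."""
    pred = model.predict(X[400:])
    return float(np.mean((pred - Y[400:]) ** 2))

print("linear MSE:", round(test_mse(linear), 4))  # limited by the nonlinearity
print("MLP MSE:   ", round(test_mse(mlp), 4))     # typically lower on this toy task
```

The same logic drives the paper’s result: when the neural-to-acoustic mapping is nonlinear, a deep network estimating synthesizer parameters can outperform linear reconstruction of the spectrogram.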