Existing electronic skin (e-skin) sensing platforms are equipped to monitor physical parameters using power from batteries or near-field communication. For e-skins to be applied in the next generation of robotics and medical devices, they must operate wirelessly and be self-powered. However, despite recent efforts to harvest energy from the human body, self-powered e-skins that can perform biosensing with Bluetooth communication remain limited by the lack of a continuous energy source and by poor power efficiency. Here, we report a flexible and fully perspiration-powered integrated electronic skin (PPES) for multiplexed metabolic sensing in situ. The battery-free e-skin contains multimodal sensors and highly efficient lactate biofuel cells that use a unique integration of zero- to three-dimensional nanomaterials to achieve high power intensity and long-term stability. The PPES delivered a record-breaking power density of 3.5 milliwatts per square centimeter for biofuel cells in untreated human body fluids (human sweat) and maintained stable performance over 60 hours of continuous operation. It selectively monitored key metabolic analytes (e.g., urea, NH4+, glucose, and pH) and skin temperature during prolonged physical activity and wirelessly transmitted the data to the user interface over Bluetooth. The PPES was also able to monitor muscle contraction and work as a human-machine interface for human-prosthesis walking.
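
To make the described data path concrete (sensor front end, calibration, wireless packet), the sketch below is a minimal, hypothetical Python model of how such a multiplexed e-skin might turn raw sensor voltages into calibrated readings and push them out as Bluetooth packets. The channel names, calibration constants, and transmit stub are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch (not from the paper): how a multiplexed e-skin readout loop
# might turn raw sensor voltages into calibrated values and a wireless packet.
# Channel names, calibration constants, and the transmit stub are assumptions.

import json
import random
import time

# Hypothetical linear calibrations: value = slope * voltage + offset
CALIBRATION = {
    "urea_mM":     (12.0, 0.0),
    "nh4_mM":      (8.5, 0.1),
    "glucose_uM":  (150.0, 5.0),
    "pH":          (-5.9, 7.0),   # potentiometric channels are roughly linear
    "skin_temp_C": (100.0, 0.0),
}

def read_raw_voltages():
    """Stand-in for the analog front end; returns one voltage per channel."""
    return {name: random.uniform(0.0, 0.5) for name in CALIBRATION}

def calibrate(raw):
    """Apply the per-channel linear calibration to the raw voltages."""
    return {name: slope * raw[name] + offset
            for name, (slope, offset) in CALIBRATION.items()}

def transmit_ble(packet):
    """Stub for the Bluetooth Low Energy link; a real device would notify a
    characteristic here. We just print the JSON payload."""
    print(json.dumps(packet))

if __name__ == "__main__":
    for _ in range(3):                      # a few cycles of the sensing loop
        readings = calibrate(read_raw_voltages())
        transmit_ble({"t": time.time(), **readings})
        time.sleep(1.0)
```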

Recent advances in robotics have enabled soft electronic devices at different scales with excellent biocompatibility and mechanical properties; these advances have made possible novel robotic functionalities for various medical applications, such as diagnosis and drug delivery, soft surgical tools, human-machine interaction (HMI), wearable computing, health monitoring, assistive robotics, and prostheses (1–6). Electronic skin (e-skin) can have characteristics similar to those of human skin, such as mechanical durability and stretchability, as well as the ability to measure sensations such as temperature and pressure (7–11). Moreover, e-skin can be augmented with capabilities beyond those of normal human skin by incorporating advanced bioelectronic materials and devices.

While we might often take our sense of touch for granted, for researchers developing technologies to restore limb function in people paralyzed by spinal cord injury or disease, re-establishing the sense of touch is an essential part of the process. On April 23 in the journal Cell, a team of researchers at Battelle and the Ohio State University Wexner Medical Center report that they have been able to restore sensation to the hand of a research participant with a severe spinal cord injury using a brain-computer interface (BCI) system. The technology harnesses neural signals that are so minuscule they can’t be perceived and enhances them via artificial sensory feedback sent back to the participant, resulting in greatly enriched motor function.

“We’re taking subperceptual events and boosting them into conscious perception,” says first author Patrick Ganzer, a principal research scientist at Battelle. “When we did this, we saw several functional improvements. It was a big eureka moment when we first restored the participant’s sense of touch.”

The participant in this study is Ian Burkhart, a 28-year-old man who suffered a spinal cord injury during a diving accident in 2010. Since 2014, Burkhart has been working with investigators on a project called NeuroLife that aims to restore function to his right arm. The device they have developed works through a system of electrodes on his skin and a small computer chip implanted in his motor cortex. This setup, which uses wires to route movement signals from the brain to the muscles, bypassing his spinal cord injury, gives Burkhart enough control over his arm and hand to lift a coffee mug, swipe a credit card, and play Guitar Hero.
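
The two signal paths described here (movement intent decoded from motor cortex and routed to skin-surface muscle stimulation, and faint touch-related activity boosted into feedback the user can feel) can be pictured with a toy closed-loop sketch. All thresholds, channel names, and device stubs below are hypothetical illustrations of the routing logic, not the NeuroLife system's actual code.

```python
# Toy sketch of the two routing paths described above. All names, thresholds,
# and device interfaces are hypothetical illustrations, not the actual system.

import random

PERCEPTUAL_THRESHOLD = 1.0   # assumed level below which touch signals go unnoticed
NOISE_FLOOR = 0.2            # assumed level below which the channel is treated as noise
GAIN = 5.0                   # assumed boost applied to subperceptual touch activity

def read_neural_frame():
    """Stand-in for one frame of recorded cortical activity."""
    return {
        "grip_intent": random.uniform(0.0, 1.0),   # decoded motor feature
        "touch_evoked": random.uniform(0.0, 1.5),  # residual touch-related signal
    }

def stimulate_forearm(level):
    """Stub for the skin-surface electrodes driving the arm muscles."""
    print(f"muscle stimulation: {level:.2f}")

def haptic_feedback(level):
    """Stub for sensory feedback delivered where sensation is intact."""
    print(f"haptic feedback: {level:.2f}")

def process_frame(frame):
    # Path 1: motor intent bypasses the injury and drives the muscles directly.
    if frame["grip_intent"] > 0.5:
        stimulate_forearm(frame["grip_intent"])

    # Path 2: touch activity too small to be felt is boosted into feedback
    # the user can consciously perceive.
    touch = frame["touch_evoked"]
    if NOISE_FLOOR < touch < PERCEPTUAL_THRESHOLD:
        haptic_feedback(min(1.0, GAIN * touch))

if __name__ == "__main__":
    for _ in range(5):
        process_frame(read_neural_frame())
```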

Researchers from Carnegie Mellon and the University of Pittsburgh today published research showing how they’d solved a frustrating problem for people who use a brain-computer interface (BCI) to control prosthetic devices with their thoughts.

While the research itself is interesting – they created an algorithm that keeps the devices from constantly needing to be recalibrated to handle the human brain’s fluctuating neuronal activity – the real takeaway here is how close we are to a universal BCI.
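
One way such self-recalibration can work, and a common approach in the BCI literature (not necessarily the exact method behind this result), is to keep the trained decoder fixed and re-align each new session's low-dimensional neural activity to a reference session, for example with an orthogonal Procrustes fit. Below is a minimal numpy sketch with made-up data.

```python
# Minimal sketch of decoder stabilization by latent-space alignment.
# Idea: neural recordings drift from day to day, so instead of retraining the
# decoder, map today's low-dimensional activity back onto a reference day's
# space and reuse the original decoder. Shapes and data here are made up.

import numpy as np

def orthogonal_procrustes(source, target):
    """Orthogonal R minimizing ||source @ R - target||_F (both: samples x dims)."""
    u, _, vt = np.linalg.svd(source.T @ target)
    return u @ vt

rng = np.random.default_rng(0)

# Reference session: latent neural activity and a linear decoder trained on it.
latents_ref = rng.normal(size=(500, 10))            # samples x latent dims
decoder = rng.normal(size=(10, 2))                  # maps latents to 2-D cursor velocity

# New session: same underlying activity, but the recording has drifted (rotated + noisy).
drift, _ = np.linalg.qr(rng.normal(size=(10, 10)))  # random orthogonal "drift"
latents_new = latents_ref @ drift + 0.05 * rng.normal(size=(500, 10))

# Re-align the new session to the reference space, then reuse the old decoder.
R = orthogonal_procrustes(latents_new, latents_ref)
aligned = latents_new @ R

unaligned_err = np.linalg.norm(latents_new @ decoder - latents_ref @ decoder)
aligned_err = np.linalg.norm(aligned @ decoder - latents_ref @ decoder)
print(f"decoder output error without alignment: {unaligned_err:.1f}")
print(f"decoder output error with alignment:    {aligned_err:.1f}")
```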

BCIs have been around for decades in one form or another, but they’re costly to maintain and difficult to keep working properly. Currently, they only make sense for narrow uses – specifically, for those who’ve lost limbs. Because those users are already accustomed to using their brain to control an appendage, it’s easier for scientists and researchers to harness those brain signals to control prosthetic devices.

Meet 7-year-old Ethan, who received his bionic Hero Arm yesterday at the Hanger Clinic in Aurora, Illinois. Ethan contracted sepsis shortly after his second birthday and he was given a five per cent chance of survival, but the superhero in him fought and survived. Stay strong, Ethan, and may the Force be with you! ✨ 😍 💪 #EthanStrong

A.I. has already gotten to almost sci-fi levels of emulating brain activity, so much so that amputees can experience mind-controlled robotic arms, and neural networks might soon be a thing. That still wasn’t enough for the brains behind one ambitious startup, though.

Cortical Labs sounds like it could have been pulled from the future. Co-founder and CEO Hon Weng Chong and his team are merging biology and technology by embedding real neurons onto a specialized computer chip. Instead of being programmed to act like a human brain, the chip will use those neurons to think, learn, and function on its own. The hybrid chips will save tremendous amounts of energy, with actual neurons doing the processing for them.