
Researchers at Tufts University’s School of Engineering have developed biomaterial-based inks that respond to and quantify chemicals released from the body (e.g. in sweat and potentially other biofluids) or in the surrounding environment by changing color. The inks can be screen printed onto textiles such as clothes, shoes, or even face masks in complex patterns and at high resolution, providing a detailed map of human response or exposure. The advance in wearable sensing, reported in Advanced Materials, could simultaneously detect and quantify a wide range of biological conditions, molecules and, possibly, pathogens over the surface of the body using conventional garments and uniforms.
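The report does not spell out how a color change is converted into a concentration, but colorimetric sensors of this kind are typically read out against a pre-recorded calibration curve. The short Python sketch below is only a minimal illustration of that step, assuming a hypothetical single-channel calibration table; none of the values come from the Tufts study.

```python
import numpy as np

# Hypothetical calibration: color shift (single-channel delta, 0-255)
# measured for known analyte concentrations (mM). Values are illustrative only.
cal_concentration = np.array([0.0, 1.0, 2.5, 5.0, 10.0])    # mM
cal_color_shift   = np.array([0.0, 12.0, 27.0, 51.0, 90.0])  # channel delta

def estimate_concentration(color_shift: float) -> float:
    """Interpolate a measured color shift onto the calibration curve."""
    return float(np.interp(color_shift, cal_color_shift, cal_concentration))

# Example: a printed patch shifts by 40 units -> roughly 3.9 mM
print(estimate_concentration(40.0))
```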

“The use of novel bioactive inks with the very common method of screen printing opens up promising opportunities for the mass-production of soft, wearable fabrics with large numbers of sensors that could be applied to detect a range of conditions,” said Fiorenzo Omenetto, corresponding author and the Frank C. Doble Professor of Engineering at Tufts’ School of Engineering. “The fabrics can end up in uniforms for the workplace, sports clothing, or even on furniture and architectural structures.”

Wearable sensing devices have attracted considerable interest in monitoring human performance and health. Many such devices have been invented incorporating electronics in wearable patches, wristbands, and other configurations that monitor either localized or overall physiological information such as heart rate or blood glucose. The research presented by the Tufts team takes a different, complementary approach—non-electronic, colorimetric detection of a theoretically very large number of analytes using sensing garments that can be distributed to cover very large areas: anything from a patch to the entire body, and beyond.

When it comes to monitoring electrical activity in the brain, patients typically have to lie very still inside a large magnetoencephalography (MEG) machine. That could be about to change, though, as scientists have developed a new version of a wearable helmet that does the same job.

Back in 2018, researchers at Britain’s University of Nottingham revealed the original version of their “MEG helmet.”

The 3D-printed device was fitted with multiple sensors that allowed it to read the tiny magnetic fields created by brain waves, just like a regular MEG machine. Unlike with a conventional machine, however, wearers could move around while the readings were taking place.

A thin, iron-based generator uses waste heat to provide small amounts of power.

Researchers have found a way to convert heat energy into electricity with a nontoxic material. The material is mostly iron, which is extremely cheap given its relative abundance. A generator based on this material could power small devices such as remote sensors or wearables. The material can also be made thin, so it could be shaped into various forms.

There’s no such thing as a free lunch, or free energy. But if your energy demands are low enough, say for a small sensor of some kind, then there is a way to harness heat to supply power without wires or batteries. Research Associate Akito Sakai and group members from his laboratory at the University of Tokyo’s Institute for Solid State Physics and Department of Physics, led by Professor Satoru Nakatsuji, and from the Department of Applied Physics, led by Professor Ryotaro Arita, have taken steps towards this goal with their innovative iron-based thermoelectric material.
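For a rough sense of why such a generator suits only very small loads, the textbook Seebeck relation (open-circuit voltage V = S·ΔT, with at most V²/4R delivered to a matched load) can be evaluated with illustrative numbers. The coefficient, temperature difference, and resistance below are assumptions chosen for scale, not figures from the Tokyo group’s work.

```python
# Back-of-envelope thermoelectric estimate (illustrative values only).
S = 20e-6         # Seebeck coefficient, V/K (assumed order of magnitude)
delta_T = 10.0    # temperature difference across the element, K (assumed)
R_internal = 1.0  # internal resistance of the element, ohms (assumed)

V_open = S * delta_T                    # open-circuit voltage
P_max = V_open ** 2 / (4 * R_internal)  # max power into a matched load

print(f"open-circuit voltage: {V_open * 1e6:.0f} microvolts")   # 200
print(f"max deliverable power: {P_max * 1e9:.0f} nanowatts")    # 10
```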

More portable, fully wireless smart home setups. Lower power wearables. Batteryless smart devices. These could all be made possible thanks to a new ultra-low power Wi-Fi radio developed by electrical engineers at the University of California San Diego.

The device, which is housed in a chip smaller than a grain of rice, enables Internet of Things (IoT) devices to communicate with existing Wi-Fi networks using 5,000 times less power than today’s Wi-Fi radios. It consumes just 28 microwatts of power. And it does so while transmitting data at a rate of 2 megabits per second (a connection fast enough to stream music and most YouTube videos) over a range of up to 21 meters.
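Taking the figures above at face value, the implied power of a conventional radio and the energy spent per transmitted bit follow from simple arithmetic; note that the 140 mW baseline below is just 28 microwatts multiplied by 5,000, not a number quoted by the researchers.

```python
# Power and energy-per-bit arithmetic from the figures quoted above.
p_chip = 28e-6   # chip power, watts (28 microwatts)
ratio = 5000     # claimed reduction versus a conventional Wi-Fi radio
bitrate = 2e6    # data rate, bits per second (2 Mbps)

p_conventional = p_chip * ratio    # implied conventional radio power
energy_per_bit = p_chip / bitrate  # joules spent per transmitted bit

print(f"implied conventional radio power: {p_conventional * 1e3:.0f} mW")  # 140 mW
print(f"energy per bit: {energy_per_bit * 1e12:.0f} pJ")                   # 14 pJ
```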

The team will present their work at the ISSCC 2020 conference Feb. 16 to 20 in San Francisco.

Roam Robotics is making robotic exoskeletons that are lightweight and affordable so that they can become a new category of consumer electronics. Traditional robotic exoskeletons can weigh between 30 and 60 pounds because they rely on high-precision mechanical systems. They are big and bulky and cost as much as a luxury car, which significantly limits their usefulness and availability. Roam’s new robotic exoskeletons are so portable and inexpensive that they could quickly become a commonplace part of modern life.

In an effort to create first-of-kind microelectronic devices that connect with biological systems, University of Maryland (UMD) researchers are utilizing CRISPR technology in a novel way to electronically turn “on” and “off” several genes simultaneously. Their technique, published in Nature Communications, has the potential to further bridge the gap between the electronic and biological worlds, paving the way for new wearable and “smart” devices.

“Faced with the COVID-19 pandemic, we now have an even deeper understanding of how ‘smart’ devices could benefit the general population,” said William E. Bentley, professor in UMD’s Fischell Department of Bioengineering and Institute for Bioscience and Biotechnology Research (IBBR), and director of the Robert E. Fischell Institute for Biomedical Devices. “Imagine what the world would be like if we could wear a device and access an app on our smartphone capable of detecting whether the wearer has the active virus, generated immunity, or has not been infected. We don’t have this yet, but it is increasingly clear that a suite of technologies enabling rapid transfer of information between biology and electronics is needed to make this a reality.”

With such a device, this information could be used, for example, to dynamically and autonomously conduct effective contact tracing, Bentley said.

Stroke is the leading cause of serious long-term disability in the US, and approximately 17 million individuals worldwide experience a stroke each year. About 8 out of 10 stroke survivors suffer from “hemiparesis”, a paralysis that typically affects the limbs and facial muscles on one side of the body and often causes severe difficulty walking, a loss of balance with an increased risk of falling, and muscle fatigue that sets in quickly during exertion. Oftentimes, these impairments also make it impossible for them to perform basic everyday activities.

To help stroke survivors recover, many rehabilitation centers have looked to robotic exoskeletons. But although a range of exciting devices now enable people who were initially unable to walk to do so again, significant active research is still needed to understand how best to apply wearable robotics to rehabilitation after stroke. Despite the promise, recent clinical practice guidelines even recommend against the use of robotic therapies when the goal is to improve walking speed or distance.

In 2017, a multidisciplinary team of mechanical and electrical engineers, apparel designers, and neurorehabilitation experts at Harvard’s Wyss Institute for Biologically Inspired Engineering and John A. Paulson School of Engineering and Applied Sciences (SEAS), and Boston University’s (BU) College of Health & Rehabilitation Sciences: Sargent College showed that an ankle-assisting soft robotic exosuit, tethered to an external battery and motor, was able to significantly improve biomechanical gait functions in stroke patients when worn while walking on a treadmill. The cross-institutional and cross-disciplinary team effort was led by Wyss faculty members Conor Walsh, Ph.D., and Lou Awad, P.T., D.P.T., Ph.D., together with Terry Ellis, Ph.D., P.T., N.C.S., from BU.

The team saw some early successes regarding movement — the initial goal of the BCI — allowing Burkhart to press buttons along the neck of a “Guitar Hero” controller.

But returning touch to his hand was a much more daunting task. By using a simple vibration device or “wearable haptic system,” Burkhart was able to tell if he was touching an object or not without seeing it.

“It’s definitely strange,” Burkhart told Wired. “It’s still not normal, but it’s definitely much better than not having any sensory information going back to my body.”

A Korean research team has developed a lithium-ion battery that is flexible enough to be stretched. Dr. Jeong Gon Son’s research team at the Photo-Electronic Hybrids Research Center at the Korea Institute of Science and Technology (KIST) announced that they had constructed a high-capacity, stretchable lithium-ion battery. The battery was developed by fabricating a structurally stretchable electrode consisting solely of electrode materials and then assembling it with a stretchable gel electrolyte and stretchable packaging.

Rapid technological advancements in the electronics industry have led to a fast-growing market for high-performance wearable devices, such as smart bands, and body-implantable electronic devices, such as pacemakers. These advancements have considerably increased the need for energy storage devices designed in flexible and stretchable forms that mimic human skin and organs.

However, it is very difficult to make such a battery stretchable, because the solid inorganic electrode material occupies most of its volume and other components, such as current collectors and separators, must also be made stretchable. In addition, the problem of liquid electrolyte leakage under deformation must be solved.

Albert Einstein famously postulated that “the only real valuable thing is intuition,” arguably one of the most important keys to understanding intention and communication.

But intuitiveness is hard to teach—especially to a machine. Looking to improve this, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) came up with a method that dials us closer to more seamless human–robot collaboration. The system, called “Conduct-A-Bot,” uses human signals from wearable sensors to pilot a robot’s movement.
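The CSAIL pipeline itself relies on more sophisticated gesture detection, but the core idea of turning a wearable muscle or motion signal into robot commands can be sketched roughly as follows. The rolling-average envelope, the threshold value, and the “stop”/“go” command names are illustrative assumptions, not the Conduct-A-Bot implementation.

```python
from collections import deque

class MuscleCommandMapper:
    """Toy mapping from a streamed EMG-like signal to robot commands.

    A rolling window of rectified samples is averaged into an activity
    envelope; crossing the threshold issues 'stop', otherwise 'go'.
    """

    def __init__(self, window: int = 50, threshold: float = 0.3):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def update(self, raw_sample: float) -> str:
        self.samples.append(abs(raw_sample))                  # rectify
        envelope = sum(self.samples) / len(self.samples)      # smooth
        return "stop" if envelope > self.threshold else "go"

# Example: a burst of muscle activity flips the command to 'stop'.
mapper = MuscleCommandMapper()
stream = [0.02, 0.05, 0.04, 0.8, 0.9, 0.85, 0.7]
print([mapper.update(s) for s in stream])
```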

“We envision a world in which machines help people with cognitive and physical work, and to do so, they adapt to people rather than the other way around,” says Professor Daniela Rus, director of CSAIL, deputy dean of research for the MIT Stephen A. Schwarzman College of Computing, and co-author on a paper about the system.