When the holiday season kicks off next fall (2017), I have a feeling I may end up buying a Penny Robot or a BMI-controlled drone for my niece and nephews.
That feeling comes from new research out of Arizona State University, backed by DARPA funding.
Using a skullcap fitted with 128 electrodes wired to a computer, researchers can control multiple drones with human thought alone, guiding the quadcopters wirelessly. The cap records electrical brain activity, and software identifies which parts of the brain light up and maps that activity to drone movement. The decoded signal is relayed to a second computer, which wirelessly transmits commands to the drones, making them move.

Panagiotis Artemiadis, director of the Human-Oriented Robotics and Control Lab and an assistant professor of mechanical and aerospace engineering at the School for Engineering of Matter, Transport and Energy in the Ira A. Fulton Schools of Engineering, developed the technology with funding from the Defense Advanced Research Projects Agency (DARPA) and the U.S. Air Force. Artemiadis has been working on brain-to-machine interfaces since 2009, but only recently made the leap to controlling more than one device at a time.
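To make the pipeline concrete, here is a minimal sketch of the acquire-decode-relay loop described above. Everything in it is an assumption for illustration: the electrode indices, the threshold decoder, and the function names are invented stand-ins, not Artemiadis's actual system, which the article says uses a trained mapping from brain activity to swarm behavior rather than a simple threshold.

```python
import random  # stands in for a real EEG acquisition driver

N_ELECTRODES = 128   # matches the 128-electrode skullcap in the article
WINDOW_SIZE = 64     # samples per decoding window (assumed)

def read_eeg_window():
    """Hypothetical stand-in for the EEG driver: returns one window
    of raw voltages, one list of samples per electrode."""
    return [[random.gauss(0.0, 1.0) for _ in range(WINDOW_SIZE)]
            for _ in range(N_ELECTRODES)]

def decode_command(window):
    """Toy decoder: averages activity over a (made-up) group of
    motor-cortex electrodes and thresholds it into a swarm command.
    A real system would use a trained classifier."""
    motor_channels = window[30:40]  # assumed electrode indices
    activity = sum(sum(ch) for ch in motor_channels) / (10 * WINDOW_SIZE)
    if activity > 0.5:
        return "spread"     # e.g. widen the swarm formation
    if activity < -0.5:
        return "contract"
    return "hold"

def send_to_swarm(command, drone_ids):
    """Placeholder for the wireless link that relays the decoded
    command to every quadcopter in the swarm."""
    for drone in drone_ids:
        print(f"drone {drone}: {command}")

if __name__ == "__main__":
    drones = ["q1", "q2", "q3", "q4"]
    for _ in range(3):  # three decoding cycles
        cmd = decode_command(read_eeg_window())
        send_to_swarm(cmd, drones)
```

The key architectural point the sketch captures is the two-hop design: one machine records and decodes the brain signal, and a second one handles the radio link to the drones.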