
DARPA, the Department of Defense’s research arm, is paying scientists to invent ways to instantly read soldiers’ minds using tools like genetic engineering of the human brain, nanotechnology and infrared beams. The end goal? Applications such as thought-controlled weapons, like swarms of drones that someone sends to the skies with a single thought, or the ability to beam images from one brain to another.

This week, DARPA (Defense Advanced Research Projects Agency) announced that six teams will receive funding under the Next-Generation Nonsurgical Neurotechnology (N3) program. Participants are tasked with developing technology that will provide a two-way channel for rapid and seamless communication between the human brain and machines without requiring surgery.

“Imagine someone who’s operating a drone or someone who might be analyzing a lot of data,” said Jacob Robinson, an assistant professor of bioengineering at Rice University, who is leading one of the teams.

Read more

Facebook removed more than 3 billion fake accounts from October to March, twice as many as the previous six months, the company said Thursday.

Nearly all of them were caught before they had a chance to become “active” users of the social network.

In a new report, Facebook said it saw a “steep increase” in the creation of abusive, fake accounts in the past six months. While most of these fake accounts were blocked “within minutes” of their creation, the company said this increase of “automated attacks” by bad actors meant not only that it caught more of the fake accounts, but that more of them slipped through the cracks.

Read more

New research from the laboratory of Ozgur Sahin, associate professor of biological sciences and physics at Columbia University, shows that materials can be fabricated to create soft actuators—devices that convert energy into physical motion—that are strong and flexible, and, most important, resistant to water damage.

“There’s a growing trend of making anything we interact with and touch from materials that are dynamic and responsive to the environment,” Sahin says. “We found a way to develop a material that is water-resistant yet, at the same time, equipped to harness water to deliver the force and motion needed to actuate.”

The research was published online May 21 in Advanced Materials Technologies.

Read more

Autonomous vehicles might someday be able to navigate bustling city streets to deliver groceries, pizzas, and other packages without a human behind the wheel. But that doesn’t solve what Ford Motor CTO Ken Washington describes as the last 50-foot problem.

Ford and startup Agility Robotics are partnering in a research project that will test how two-legged robots and self-driving vehicles can work together to solve that curb-to-door problem. Agility’s Digit, a two-legged robot that has a lidar where its head should be, will be used in the project. The robot, which is capable of lifting 40 pounds, can ride along in a self-driving vehicle and be deployed when needed to deliver packages.

“We’re looking at the opportunity of autonomous vehicles through the lens of the consumer and we know from some early experimentation that there are challenges with the last 50 feet,” Washington told TechCrunch in a recent interview. Finding a solution could be an important differentiator for Ford’s commercial robotaxi service, which it plans to launch in 2021.

Read more

SpotMini autonomously navigates a specified route through an office and lab facility. Before the test, the robot is manually driven through the space so it can build a map of the space using visual data from cameras mounted on the front, back and sides of the robot. During the autonomous run, SpotMini uses data from the cameras to localize itself in the map and to detect and avoid obstacles. Once the operator presses ‘GO’ at the beginning of the video, the robot is on its own. Total walk time for this route is just over 6 minutes. (The QR codes visible in the video are used to measure performance, not for navigation.)
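The localization step described above, matching live camera observations against a prebuilt map, can be sketched with a toy histogram filter. This is a minimal illustration under stated assumptions (a 1D cyclic world with binary landmarks and exact motion); Boston Dynamics has not published SpotMini’s actual algorithm, and the world map, observation sequence, and sensor probabilities here are invented for the example.

```python
# Prebuilt "map" from the manual walkthrough: 1 marks a cell with a
# visual landmark, 0 an empty cell. (Hypothetical toy world.)
WORLD = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]

def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def sense(belief, observation, hit=0.9, miss=0.1):
    """Weight each map cell by how well it explains the camera observation."""
    belief = [b * (hit if WORLD[i] == observation else miss)
              for i, b in enumerate(belief)]
    return normalize(belief)

def move(belief, step):
    """Shift the belief as the robot walks `step` cells (cyclic, exact motion)."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

# During the autonomous run: start fully uncertain, then alternate
# sensing and moving. The observation sequence 1,0,0,0,1 uniquely
# matches the stretch of map starting at cell 3.
belief = [1.0 / len(WORLD)] * len(WORLD)
for obs, step in [(1, 1), (0, 1), (0, 1), (0, 1), (1, 1)]:
    belief = move(sense(belief, obs), step)

# Most probable current position: started at cell 3, moved 5 cells.
estimate = max(range(len(belief)), key=lambda i: belief[i])
print(estimate)  # → 8
```

A real system replaces the binary landmark check with feature matching against imagery from the robot’s cameras, but the update structure (weight by observation, shift by motion) is the same.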

Read more

The work should lead to control of one to a few hundred atoms at microsecond timescales using AI control of electron beams. The computational and analytical framework developed in this work is general and can further help develop techniques for controlling single-atom dynamics in 3D materials, and ultimately scaling up to manipulating assemblies of 1 to 1,000 atoms with high speed and efficacy.

Scientists at MIT, the University of Vienna, and several other institutions have taken a step toward developing a method that can reposition atoms with a highly focused electron beam and control their exact location and bonding orientation. The finding could ultimately lead to new ways of making quantum computing devices or sensors, and usher in a new age of “atomic engineering,” they say.

Read more