
Robot bees are no replacement for our vital pollinators here on Earth. Up on the International Space Station, however, robots bearing the bee name could help spacefaring humans save precious time.

On Friday, NASA astronaut Anne McClain took one of the trio of Astrobees out for a spin. Bumble and its companion Honey both arrived on the ISS a month ago and are currently going through a series of checks. Bumble passed the first hurdle when McClain manually flew it around the Japanese Experiment Module, taking photos that will be used to build a map of the module for all the Astrobees, guiding them as they begin their tests there.

The three cube-shaped robots (Queen will arrive from Earth in the SpaceX resupply mission this July) don’t look anything like their namesakes, but they are non-threatening by design, says Astrobee project manager Maria Bualat. Since they’re built to fly around autonomously, doing tasks for the crew of the International Space Station, “one of our hardest problems is actually dealing with safety concerns,” she says.


Putting their own twist on robots that amble through complicated landscapes, the Stanford Student Robotics club’s Extreme Mobility team has developed a four-legged robot that is not only capable of performing acrobatic tricks and traversing challenging terrain but is also designed with reproducibility in mind. Anyone who wants their own version of the robot, dubbed Stanford Doggo, can consult comprehensive plans, code and a supply list that the students have made freely available online.

“We had seen these other quadruped robots used in research, but they weren’t something that you could bring into your own lab and use for your own projects,” said Nathan Kau, ‘20, a mechanical engineering major and lead for Extreme Mobility. “We wanted Stanford Doggo to be this open source robot that you could build yourself on a relatively small budget.”

Whereas other similar robots can cost tens or hundreds of thousands of dollars and require customized parts, the Extreme Mobility students estimate the cost of Stanford Doggo at less than $3,000—including manufacturing and shipping costs—and nearly all the components can be bought as-is online. They hope the accessibility of these resources inspires a community of Stanford Doggo makers and researchers who develop innovative and meaningful spinoffs from their work.


An algorithm developed by Brown University computer scientists enables robots to put pen to paper, writing words using stroke patterns similar to human handwriting. It’s a step, the researchers say, toward robots that are able to communicate more fluently with human co-workers and collaborators.

“Just by looking at a target image of a word or sketch, the robot can reproduce each stroke as one continuous action,” said Atsunobu Kotani, an undergraduate student at Brown who led the algorithm’s development. “That makes it hard for people to distinguish if it was written by the robot or actually written by a human.”

The algorithm makes use of deep learning networks that analyze images of handwritten words or sketches and can deduce the likely series of pen strokes that created them. The robot can then reproduce the words or sketches using the pen strokes it learned. In a paper to be presented at this month’s International Conference on Robotics and Automation, the researchers demonstrate a robot that was able to write “hello” in 10 languages that employ different character sets. The robot was also able to reproduce rough sketches, including one of the Mona Lisa.
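The core idea, recovering an ordered sequence of pen strokes from a static image of ink, can be loosely illustrated in code. The sketch below is hypothetical and is not the Brown team's model: their system uses deep learning networks, while this toy stands in for that inference with a greedy nearest-neighbour walk over "ink" pixels, lifting the pen whenever the next pixel is too far away. All names (`extract_strokes`, `max_jump`) are invented for illustration.

```python
# Toy stroke recovery: order a set of (x, y) ink pixels into pen strokes.
# This greedy walk is a stand-in for the learned inference described above.

def extract_strokes(ink, max_jump=1.5):
    """Group (x, y) ink pixels into ordered strokes.

    A new stroke starts (the pen lifts) whenever the nearest remaining
    pixel is farther than `max_jump` away from the current pen position.
    """
    remaining = set(ink)
    strokes = []
    while remaining:
        # Begin a stroke at the left/top-most remaining pixel.
        current = min(remaining)
        remaining.remove(current)
        stroke = [current]
        while remaining:
            # Nearest remaining pixel (squared Euclidean distance).
            nxt = min(remaining,
                      key=lambda p: (p[0] - current[0]) ** 2 + (p[1] - current[1]) ** 2)
            dist2 = (nxt[0] - current[0]) ** 2 + (nxt[1] - current[1]) ** 2
            if dist2 > max_jump ** 2:
                break  # pen up: the rest of the ink belongs to a new stroke
            remaining.remove(nxt)
            stroke.append(nxt)
            current = nxt
        strokes.append(stroke)
    return strokes

# Two separate marks: a short vertical bar and a lone dot far away.
ink = [(0, 0), (0, 1), (0, 2), (5, 5)]
print(extract_strokes(ink))  # → [[(0, 0), (0, 1), (0, 2)], [(5, 5)]]
```

The point of the illustration is the output format: a word becomes a list of strokes, each a continuous ordered path, which is exactly what a robot arm needs in order to reproduce writing "as one continuous action" per stroke.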


About half a dozen technological approaches to quantum computing are vying for preeminence these days. The ion trap method differs from the most popular approach—the silicon chip-based “superconducting qubit”—preferred by the likes of IBM, Google, Intel, and other tech giants. Honeywell, the industrial conglomerate, is one of the few companies pursuing the ion trap approach along with IonQ.

“Quantum computers can potentially solve many of the problems we have today,” Chapman told Fortune on a call. He listed off potential areas of impact, such as drug discovery, energy, logistics, materials science, and A.I. techniques. “How would you not want to be part of that?”

“This is a once-in-a-generation type opportunity,” said Andrew Schoen, a principal at New Enterprise Associates, IonQ’s first backer. “We view this as a chance to build the next Intel.”


A new research project aims to harness the power of quantum computers to build a new type of neural network — work the researchers say could usher in the next generation of artificial intelligence.

“My colleagues and I instead hope to build the first dedicated neural network computer, using the latest ‘quantum’ technology rather than AI software,” wrote Michael Hartmann, a professor at Heriot-Watt University who’s leading the research, in a new essay for The Conversation. “By combining these two branches of computing, we hope to produce a breakthrough which leads to AI that operates at unprecedented speed, automatically making very complex decisions in a very short time.”


What if drones and self-driving cars had the tingling “spidey senses” of Spider-Man?

They might actually detect and avoid objects better, says Andres Arrieta, an assistant professor of mechanical engineering at Purdue University, because they would process sensory information faster.

Better sensing capabilities would make it possible for drones to navigate in dangerous environments and for cars to prevent accidents caused by human error. Current state-of-the-art sensor technology doesn’t process data fast enough—but nature does.
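The "spidey sense" idea is that nature filters at the sensor itself: a mechanosensor fires only on meaningful stimuli, so the brain never sees the boring data. A minimal, hypothetical sketch of that event-driven principle (this is an illustration of the concept, not Purdue's actual sensor design; `event_filter` and its threshold are invented):

```python
# Event-driven sensing sketch: instead of shipping every raw sample
# downstream, emit an event only when the signal changes meaningfully
# since the last event — the sensor does the first round of "thinking."

def event_filter(samples, threshold):
    """Yield (index, value) only for samples whose change since the
    last emitted event exceeds `threshold` in magnitude."""
    last = None
    for i, value in enumerate(samples):
        if last is None or abs(value - last) > threshold:
            yield (i, value)
            last = value

# A mostly flat signal with one sharp spike: only the interesting
# samples survive, shrinking the downstream processing load.
signal = [0.0, 0.01, 0.02, 5.0, 5.01, 0.0]
print(list(event_filter(signal, threshold=1.0)))  # → [(0, 0.0), (3, 5.0), (5, 0.0)]
```

Because the filter discards the steady-state samples, whatever processor sits behind it handles three events instead of six samples here; at real sensor rates that ratio is what buys the speed the article is describing.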


Conclusion

As Nvidia CEO Jensen Huang has stated, “Software ate the world, but AI is going to eat software.” Extending this statement to its more immediate implication: AI will first eat healthcare, dramatically accelerating longevity research and amplifying the human healthspan.

Next week, I’ll continue to explore this concept of AI systems in healthcare.


DARPA has awarded funding to six organizations to support the Next-Generation Nonsurgical Neurotechnology (N3) program, first announced in March 2018. Battelle Memorial Institute, Carnegie Mellon University, Johns Hopkins University Applied Physics Laboratory, Palo Alto Research Center (PARC), Rice University, and Teledyne Scientific are leading multidisciplinary teams to develop high-resolution, bidirectional brain-machine interfaces for use by able-bodied service members. These wearable interfaces could ultimately enable diverse national security applications such as control of active cyber defense systems and swarms of unmanned aerial vehicles, or teaming with computer systems to multitask during complex missions.

“DARPA is preparing for a future in which a combination of unmanned systems, artificial intelligence, and cyber operations may cause conflicts to play out on timelines that are too short for humans to effectively manage with current technology alone,” said Al Emondi, the N3 program manager. “By creating a more accessible brain-machine interface that doesn’t require surgery to use, DARPA could deliver tools that allow mission commanders to remain meaningfully involved in dynamic operations that unfold at rapid speed.”

Over the past 18 years, DARPA has demonstrated increasingly sophisticated neurotechnologies that rely on surgically implanted electrodes to interface with the central or peripheral nervous systems. The agency has demonstrated achievements such as neural control of prosthetic limbs and restoration of the sense of touch to the users of those limbs, relief of otherwise intractable neuropsychiatric illnesses such as depression, and improvement of memory formation and recall. Due to the inherent risks of surgery, these technologies have so far been limited to use by volunteers with clinical need.
