
This robot on wheels is seven feet tall, is kitted out with cameras, microphones and sensors, and uses the three “fingers” on its hands to stock supermarket shelves with products such as bottled drinks, cans and rice bowls.


Japan’s convenience stores are turning to robots to solve their labor shortage.

Artificial intelligence (AI) experts at the University of Massachusetts Amherst and the Baylor College of Medicine report that they have successfully addressed what they call a “major, long-standing obstacle to increasing AI capabilities” by drawing inspiration from a human brain memory mechanism known as “replay.”

First author and postdoctoral researcher Gido van de Ven and principal investigator Andreas Tolias at Baylor, with Hava Siegelmann at UMass Amherst, write in Nature Communications that they have developed a new method to protect neural networks, "surprisingly efficiently," from "catastrophic forgetting": upon learning new lessons, the networks forget what they had learned before.

Siegelmann and colleagues point out that deep neural networks are the main drivers behind recent AI advances, but progress is held back by this forgetting.
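The core idea of replay is to interleave (regenerated or stored) examples of earlier tasks into training on a new task, so old knowledge is rehearsed rather than overwritten. The paper's method is a brain-inspired variant of generative replay; the sketch below uses a simpler rehearsal buffer to illustrate the mechanism, and all names in it are illustrative, not from the paper.

```python
# A minimal sketch of replay-based continual learning using a rehearsal
# buffer (an assumption for illustration; the paper's method replays
# internally generated representations instead of stored raw data).
import random

class ReplayBuffer:
    """Keeps a bounded, roughly uniform sample of past (input, label) pairs."""
    def __init__(self, capacity=100):
        self.capacity = capacity
        self.items = []
        self.seen = 0

    def add(self, example):
        # Reservoir sampling: every example seen so far has equal
        # probability of being in the buffer.
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = example

    def sample(self, k):
        return random.sample(self.items, min(k, len(self.items)))

def make_training_batch(new_batch, buffer, replay_ratio=0.5):
    """Mix replayed old-task examples into the new task's batch, so the
    network keeps seeing old lessons while learning new ones."""
    n_replay = int(len(new_batch) * replay_ratio)
    batch = list(new_batch) + buffer.sample(n_replay)
    for ex in new_batch:
        buffer.add(ex)
    random.shuffle(batch)
    return batch
```

In practice the optimizer then trains on these mixed batches, so gradients from old and new tasks are balanced instead of the new task dominating.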

Artificial intelligence researchers at North Carolina State University have improved the performance of deep neural networks by combining feature normalization and feature attention modules into a single module that they call attentive normalization (AN). The hybrid module improves the accuracy of the system significantly, while using negligible extra computational power.

“Feature normalization is a crucial element of training deep neural networks, and feature attention is equally important for helping networks highlight which features learned from raw data are most important for accomplishing a given task,” says Tianfu Wu, corresponding author of a paper on the work and an assistant professor of electrical and computer engineering at NC State. “But they have mostly been treated separately. We found that combining them made them more efficient and effective.”

To test their AN module, the researchers plugged it into four of the most widely used neural architectures: ResNets, DenseNets, MobileNetsV2 and AOGNets. They then tested the networks against two industry standard benchmarks: the ImageNet-1000 classification and the MS-COCO 2017 object detection and instance segmentation benchmark.
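Conceptually, attentive normalization replaces the single learned scale-and-shift of a standard normalization layer with a small set of candidate affine transforms, mixed per instance by learned attention weights. The sketch below is a simplified, assumed reading of that idea in plain numpy (the published AN module operates on convolutional feature maps; the parameter names here are illustrative).

```python
import numpy as np

def attentive_norm(x, gammas, betas, w, b, eps=1e-5):
    """Simplified attentive normalization over a (N, C) feature batch.

    gammas, betas: (K, C) -- K candidate scale/shift vectors
    w, b:          (C, K), (K,) -- parameters of the attention head
    (Shapes and the attention head are assumptions for this sketch.)
    """
    # Standard feature normalization using batch statistics.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)

    # Instance-specific attention weights over the K affine candidates.
    logits = x_hat @ w + b                          # (N, K)
    lam = np.exp(logits - logits.max(axis=1, keepdims=True))
    lam /= lam.sum(axis=1, keepdims=True)           # softmax -> (N, K)

    # Each instance gets its own mixture of the candidate transforms.
    gamma = lam @ gammas                            # (N, C)
    beta = lam @ betas                              # (N, C)
    return gamma * x_hat + beta
```

Because the attention head is just one small matrix multiply per instance, the extra cost over plain normalization is negligible, which matches the authors' efficiency claim.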

There are a variety of complementary observations that could be used in the search for life in extraterrestrial settings. At the molecular scale, patterns in the distribution of organics could provide powerful evidence of a biotic component. In order to observe these molecular biosignatures during spaceflight missions, it is necessary to perform separation science in situ. Microchip electrophoresis (ME) is ideally suited for this task. Although this technique is readily miniaturized and numerous instruments have been developed over the last 3 decades, to date, all lack the automation capabilities needed for future missions of exploration. We have developed a portable, automated, battery-powered, and remotely operated ME instrument coupled to laser-induced fluorescence detection. This system contains all the necessary hardware and software interfaces for end-to-end functionality. Here, we report the first application of the system for amino acid analysis coupled to an extraction unit in order to demonstrate automated sample-to-data operation. The system was remotely operated aboard a rover during a simulated Mars mission in the Atacama Desert, Chile. This is the first demonstration of a fully automated ME analysis of soil samples relevant to planetary exploration. This validation is a critical milestone in the advancement of this technology for future implementation on a spaceflight mission.

We tackle the crucial challenge of fusing different modalities of features for multimodal sentiment analysis. Mainly based on neural networks, existing approaches largely model multimodal interactions in an implicit and hard-to-understand manner. We address this limitation with inspirations from quantum theory, which contains principled methods for modeling complicated interactions and correlations. In our quantum-inspired framework, the word interaction within a single modality and the interaction across modalities are formulated with superposition and entanglement respectively at different stages. The complex-valued neural network implementation of the framework achieves comparable results to state-of-the-art systems on two benchmarking video sentiment analysis datasets. In the meantime, we produce the unimodal and bimodal sentiment directly from the model to interpret the entangled decision.
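In quantum-inspired models of this kind, a "state" within a modality is a unit complex vector; superposition combines states within one modality, and the joint state across modalities is built with a tensor product (entangled states are exactly those joint states that do not factor into a product). A minimal sketch of those three primitives, assuming nothing about the paper's specific network architecture:

```python
import numpy as np

def superpose(states, weights):
    """Complex-valued superposition of unit states within one modality."""
    psi = sum(w * s for w, s in zip(weights, states))
    return psi / np.linalg.norm(psi)

def joint_state(psi_a, psi_b):
    """Tensor product of two modalities' states; cross-modal entanglement
    means the joint state cannot be written in this factored form."""
    return np.kron(psi_a, psi_b)

def measure(psi, basis_vector):
    """Born-rule probability of an outcome (e.g. a sentiment class)."""
    return abs(np.vdot(basis_vector, psi)) ** 2
```

Reading out class probabilities via `measure` against a sentiment basis is what lets such models expose interpretable unimodal and bimodal sentiment, as the abstract describes.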

NASA will test a new precision landing system, designed for the tough terrain of the Moon and Mars, for the first time during an upcoming mission of Blue Origin’s New Shepard reusable suborbital rocket. The “Safe and Precise Landing – Integrated Capabilities Evolution” (SPLICE) system combines a number of lasers, an optical camera, and a computer that processes the collected sensor data with advanced algorithms. It works by spotting potential hazards and adjusting landing parameters on the fly to ensure a safe touchdown.

SPLICE will get a real-world test of three of its four primary subsystems during a New Shepard mission to be flown relatively soon. The Jeff Bezos–founded company typically returns its first-stage booster to Earth after making its trip to the very edge of space, but on this test of SPLICE, NASA’s automated landing technology will be operating on board the vehicle the same way it would when approaching the surface of the Moon or Mars. The elements tested will include “terrain relative navigation,” Doppler radar and SPLICE’s descent and landing computer, while a fourth major system — lidar-based hazard detection — will be tested on future planned flights.

NASA already uses automated landing for its robotic exploration craft on the surface of other planets, including the Perseverance rover headed to Mars. But a lot of work goes into selecting a landing zone with a large area of unobstructed ground that’s free of any potential hazards in order to ensure a safe touchdown. Existing systems can make some adjustments, but they’re relatively limited in that regard.
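The hazard-detection-and-retargeting loop described above can be sketched in miniature: scan an elevation map for cells whose local roughness exceeds a threshold, then divert to the nearest safe cell. This is a toy illustration of the general idea, not SPLICE's actual algorithm; the threshold and window parameters are assumptions.

```python
import numpy as np

def select_landing_site(elevation, max_relief=0.2, window=1):
    """Toy hazard scan: flag cells whose local elevation spread exceeds
    a roughness threshold, then return the safe cell nearest the center
    of the grid (the nominal landing target)."""
    rows, cols = elevation.shape
    hazard = np.zeros_like(elevation, dtype=bool)
    for r in range(rows):
        for c in range(cols):
            r0, r1 = max(0, r - window), min(rows, r + window + 1)
            c0, c1 = max(0, c - window), min(cols, c + window + 1)
            patch = elevation[r0:r1, c0:c1]
            hazard[r, c] = (patch.max() - patch.min()) > max_relief
    safe = np.argwhere(~hazard)
    if safe.size == 0:
        return None  # no safe site in view: abort or divert
    center = np.array([rows / 2, cols / 2])
    dists = np.linalg.norm(safe - center, axis=1)
    return tuple(safe[np.argmin(dists)])
```

A real lander fuses lidar, camera, and radar measurements and runs this kind of evaluation continuously during descent, retargeting as new terrain data arrives.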

After years of experimentation and testing, the Army is formally moving ahead with the development and fielding of a powered exoskeleton to help soldiers move faster and carry more while reducing overall fatigue.

Officials with Army Futures Command are currently in the process of drafting formal requirements for an infantry exoskeleton ahead of a defense industry day sometime in November, said Ted Maciuba, deputy director of the robotic requirements division for Army Futures Command.

Breaking Defense first reported news of the fresh exoskeleton effort.

This artificial spiderweb mimics the elasticity, adhesion, and tensile strength of spiderweb silk and, with the capacity to self-clean and sense objects, can even replicate some spiderweb features that rely on the behavior of spiders themselves.

Read more about the research in Science Robotics:
🕸https://fcld.ly/wsnulle
🕸https://fcld.ly/rvgs2ub