
“As several people mention in the replies to LenKusov, shooting or otherwise damaging that hefty lithium battery pack could make it explode—which is either very bad if you’re close-range, or exactly what you want if you’re somehow hitting it from a distance and trying for fireworks.”


It turns out that a flip through Spot’s user manual reveals its weaknesses.

There’s more AI news out there than anyone can possibly keep up with. But you can stay tolerably up to date on the most interesting developments with this column, which collects AI and machine learning advancements from around the world and explains why they might be important to tech, startups or civilization.

To begin on a lighthearted note: The ways researchers find to apply machine learning to the arts are always interesting — though not always practical. A team from the University of Washington wanted to see if a computer vision system could learn to tell what is being played on a piano just from an overhead view of the keys and the player’s hands.

Audeo, the system trained by Eli Shlizerman, Kun Su and Xiulong Liu, watches video of piano playing and first extracts a simple, piano-roll-like sequence of key presses. It then adds expression in the form of the length and strength of the presses, and finally converts the result into MIDI for playback through a synthesizer. The results are a little loose but definitely recognizable.
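The final stage of a pipeline like this — turning a binary piano roll into discrete note events a MIDI synthesizer can play — can be sketched in a few lines. This is a minimal illustration, not the authors' code; the roll layout (frames × 88 keys) and the fixed default velocity are assumptions for the sake of the example.

```python
# Hypothetical sketch of the last stage of an Audeo-style pipeline:
# binary piano roll (frames x 88 keys) -> MIDI-like note events.
# Expression (per-note velocity, precise duration) would come from the
# model's second stage; here we just use a fixed default velocity.

from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int      # MIDI note number (21 = lowest piano key, A0)
    start: int      # frame index of the onset
    duration: int   # number of frames the key is held
    velocity: int   # MIDI loudness, 0-127

def roll_to_events(roll, default_velocity=64):
    """Convert a binary piano roll into a list of NoteEvents."""
    events = []
    n_frames = len(roll)
    n_keys = len(roll[0]) if roll else 0
    for key in range(n_keys):
        start = None
        for t in range(n_frames):
            pressed = roll[t][key]
            if pressed and start is None:
                start = t                      # note onset
            elif not pressed and start is not None:
                events.append(NoteEvent(pitch=21 + key, start=start,
                                        duration=t - start,
                                        velocity=default_velocity))
                start = None                   # note released
        if start is not None:                  # note held to the end
            events.append(NoteEvent(pitch=21 + key, start=start,
                                    duration=n_frames - start,
                                    velocity=default_velocity))
    return events

# Toy roll: key index 39 (MIDI pitch 60, middle C) held for frames 1-3.
roll = [[0] * 88 for _ in range(5)]
for t in (1, 2, 3):
    roll[t][39] = 1
events = roll_to_events(roll)
print(events)  # one NoteEvent: pitch=60, start=1, duration=3
```

The event list would then feed a MIDI writer or synthesizer; real systems vary velocity per note to carry the expression the paragraph above describes.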

Today, machine learning permeates everyday life, with millions of users every day unlocking their phones through facial recognition or passing through AI-enabled automated security checks at airports and train stations. These tasks are possible thanks to sensors that collect optical information and feed it to a neural network in a computer.

Scientists in China have presented a new nanoscale AI trained to perform unpowered all-optical inference at the speed of light for enhanced authentication solutions. Combining smart optical devices with imaging sensors, the system performs complex functions easily, achieving a neural density equal to 1/400th that of the human brain and more than 10 orders of magnitude higher than that of electronic processors.

Imagine empowering the sensors in everyday devices to perform artificial intelligence functions without a computer—as simply as putting glasses on them. The integrated holographic perceptrons developed by the research team at the University of Shanghai for Science and Technology, led by Professor Min Gu, a foreign member of the Chinese Academy of Engineering, can make that a reality. In the future, its neural density is expected to be 10 times that of the human brain.

“We’re looking at Flippy as a tool that helps us increase speed of service and frees team members up to focus more on other areas we want to concentrate on, whether that’s order accuracy or how we’re handling delivery partner drivers and getting them what they need when they come through the door,” said White Castle Vice President Jamie Richardson.


Flippy is the world’s first autonomous robotic kitchen assistant that can learn from its surroundings and acquire new skills over time.

A microchip carrying more than 27,000 Civil Air Patrol names with related messages and images is set to be carried to the moon later this year aboard space robotics company Astrobotic’s Peregrine lunar lander. https://www.cap.news/next-stop-the-moon-for-27000-cap-names/

A silicone robot has survived a journey to 10,900 metres below the ocean’s surface in the Mariana Trench, where the crushing pressure can implode all but the strongest enclosures. This device could lead to lighter and more nimble submersible designs.

A team led by Guorui Li at Zhejiang University in China based the robot’s design on snailfish, which have relatively delicate, soft bodies and are among the deepest-living fish. They have been observed swimming at depths of more than 8,000 metres.

The submersible robot looks a bit like a manta ray and is 22 centimetres long and 28 centimetres in wingspan. It is made of silicone rubber with electronic components spread throughout the body and connected by wires, rather than mounted on a circuit board like most submersibles. That’s because the team found in tests that the connections between components on rigid circuit boards were a weak point when placed under high pressure.