
The project is a part of a much wider effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones and warehouse robots, researchers are working to automate surgical robots too. These methods are still a long way from everyday use, but progress is accelerating.


Real scalpels, artificial intelligence — what could go wrong?

Leading industrial companies are using artificial intelligence to analyze data from their manufacturing tracking systems to spot the causes of potential defects in real time.

Robert Bosch GmbH is one of the latest to deploy AI to analyze data from its manufacturing execution systems, as the monitoring and tracking systems are called. General Electric Co. and Siemens AG have already deployed such systems.
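As a rough illustration of what real-time defect spotting can look like in code, the sketch below flags anomalous sensor readings from a production line with a standard anomaly detector. The sensor channels, threshold choice, and library are assumptions for the example, not details of any vendor's deployment.

```python
# Generic illustration (not Bosch's, GE's, or Siemens's system): flag unusual
# readings streaming from a manufacturing execution system with an Isolation
# Forest, so potential defect causes can be surfaced as parts are produced.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Hypothetical history of normal readings: temperature (deg C), torque (Nm).
history = rng.normal(loc=[70.0, 1.2], scale=[2.0, 0.05], size=(500, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(history)

def check_reading(temperature_c: float, torque_nm: float) -> bool:
    """Return True if the latest sensor reading looks like a defect precursor."""
    return model.predict([[temperature_c, torque_nm]])[0] == -1

print(check_reading(70.5, 1.21))  # within the learned pattern -> False
print(check_reading(85.0, 0.60))  # far outside it -> True
```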

The head scientist of the US Space Force has an unusual idea for how to maintain military dominance: augmenting and upgrading human soldiers.

Speaking at an Air Force Research Laboratory event, Space Force chief scientist Joel Mozer suggested that we’re entering an era during which soldiers can become a “superhuman workforce,” according to Metro, thanks to new tech including augmented and virtual reality, sophisticated AI, and nerve stimulation.

“In the last century, Western civilization transformed from an industrial-based society to an information-based society,” Mozer said, “but today we’re on the brink of a new age: the age of human augmentation.”

A team of scientists from the Max Planck Institute for Intelligent Systems (MPI-IS) has developed a system for fabricating miniature robots building block by building block, so that each robot functions exactly as required.

As one would with a Lego system, the scientists can freely combine individual components. The blocks, or voxels—which could be described as 3D pixels—are made of different materials: from basic matrix materials that hold up the construction to magnetic components enabling control of the soft machine. “You can put the individual soft parts together in any way you wish, with no limitations on what you can achieve. In this way, each has an individual magnetisation profile,” says Jiachen Zhang. Together with Ziyu Ren and Wenqi Hu, he is a first author of the paper entitled “Voxelated three-dimensional miniature magnetic soft machines via multimaterial heterogeneous assembly,” published in Science Robotics on April 28, 2021.
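As a purely illustrative sketch of how such a voxelated design might be represented in software, each building block can carry its own material type and magnetisation vector. The array layout, material IDs, and values below are assumptions for the illustration, not the MPI-IS tooling.

```python
# Hypothetical representation of a voxelated soft machine: a 3D grid where
# every voxel records its material and, for magnetic voxels, its own
# magnetisation vector (the "individual magnetisation profile").
import numpy as np

MATRIX, MAGNETIC = 0, 1            # made-up material IDs
shape = (8, 8, 4)                  # 8 x 8 x 4 building blocks

material = np.full(shape, MATRIX, dtype=np.int8)
magnetisation = np.zeros(shape + (3,))        # per-voxel (mx, my, mz)

# Example: make the top layer magnetic, each voxel with its own profile.
material[:, :, -1] = MAGNETIC
magnetisation[:, :, -1] = np.random.default_rng(0).normal(size=(8, 8, 3))

print(material.sum(), magnetisation[:, :, -1].mean())
```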

The project builds on many previous projects conducted in the Physical Intelligence Department at MPI-IS. For many years, scientists there have been working on magnetically controlled robots for wireless medical device applications at small scales, from millimeters down to micrometers. While the state-of-the-art designs they have developed to date have attracted attention around the world, they were limited by the single material from which they were made, which constrained their functionality.

Lauded for years as the operating system best able to prevent malware infection, macOS recently fell victim to a vulnerability that hackers used to circumvent all of Apple’s system defenses.

Security researcher Cedric Owens discovered the bug in March 2021 while assessing Apple’s Gatekeeper mechanism, a safeguard that only allows developers to run their applications on Macs after registering with Apple and paying a fee. Moreover, the company requires that all applications undergo an automated vetting process to further protect against malicious software.

Unfortunately, Owens uncovered a logic flaw in macOS itself, rather than in the Gatekeeper mechanism. The bug allowed attackers to craft applications able to deceive the operating system into running their malware regardless of whether they passed Apple’s safety checks. Indeed, the flaw resembles a door that has been securely locked and bolted but still has a small pet door at the bottom through which you can break in or insert a bomb.

The field of soft robotics has exploded in the past decade, as ever more researchers seek to make real the potential of these pliant, flexible automata in a variety of realms, including search and rescue, exploration and medicine.

For all the excitement surrounding these new machines, however, UC Santa Barbara mechanical engineering professor Elliot Hawkes wants to ensure that the research is more than just a flash in the pan. “Some new, rapidly growing fields never take root, while others become thriving disciplines,” Hawkes said.

To help guarantee the longevity of soft robotics research, Hawkes, whose own robots have garnered interest for their bioinspired and novel locomotion and for the new possibilities they present, offers an approach that moves the field forward. His viewpoint, written with colleagues Carmel Majidi from Carnegie Mellon University and Michael T. Tolley of UC San Diego, is published in the journal Science Robotics.

An animal scientist with Wageningen University & Research in the Netherlands has created an artificial-intelligence-based application that can gauge the emotional state of farm animals based on photographs taken with a smartphone. In his paper uploaded to the bioRxiv preprint server, Suresh Neethirajan describes his app and how well it worked when tested.

Prior research and anecdotal evidence have shown that farm animals are more productive when they are not living under stressful conditions. This has led to changes in handling practices, such as shielding cows’ eyes from the spike that is used to kill them prior to slaughter, to prevent stress hormones from entering the meat. More recent research has suggested that it may not be enough to shield animals from stressful situations—adapting their environment to promote peacefulness or even playfulness can produce desired results as well. Happy cows or goats, for example, are likely to produce more milk than those that are bored. But as Neethirajan notes, judging the emotional state of an animal can be quite subjective, leading to incorrect conclusions. To address this problem, he adapted human face recognition software for use in detecting emotions in cows and pigs.

The system is called WUR Wolf and is based on several pieces of technology: the YOLO Object Detection System, YOLOv4, which works with a convolutional neural network, and Faster R-CNN, which also detects objects but uses different feature sets. For training, Neethirajan used an Nvidia GeForce GTX 1080 Ti GPU on a computer running CUDA 9.0. The data consisted of thousands of images of cows and pigs, taken with a smartphone on six farms in several countries, with associated classification labels indicating which facial features could be associated with which mood—raised ears on a cow, for example, generally indicate that the animal is excited.
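A heavily simplified sketch of that kind of detect-then-classify pipeline is shown below. It uses a pretrained Faster R-CNN from torchvision and an untrained placeholder mood classifier, so the model choices, label set, and function names are assumptions for the illustration rather than the actual WUR Wolf code.

```python
# Hypothetical sketch: detect an animal in a smartphone photo with a pretrained
# Faster R-CNN (one of the detector families named in the paper), then pass the
# cropped region to a small classifier head standing in for the mood labels.
# The mood head is untrained and purely illustrative.
import torch
import torchvision
from torchvision.transforms import functional as F
from PIL import Image

MOODS = ["neutral", "excited", "stressed"]   # placeholder label set

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

mood_head = torch.nn.Sequential(             # untrained stand-in classifier
    torchvision.models.resnet18(weights=None),
    torch.nn.Linear(1000, len(MOODS)),
)
mood_head.eval()

def classify_mood(image_path: str) -> str:
    img = Image.open(image_path).convert("RGB")
    tensor = F.to_tensor(img)
    with torch.no_grad():
        detections = detector([tensor])[0]    # dict of boxes, labels, scores
    if len(detections["boxes"]) == 0:
        return "no animal detected"
    best = detections["scores"].argmax().item()
    x0, y0, x1, y1 = detections["boxes"][best].int().tolist()
    crop = F.resize(tensor[:, y0:y1, x0:x1], [224, 224])
    with torch.no_grad():
        logits = mood_head(crop.unsqueeze(0))
    return MOODS[logits.argmax(dim=1).item()]

print(classify_mood("cow.jpg"))  # e.g. "excited" once the head is trained on labelled farm images
```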

Training neural networks to perform tasks, such as recognizing images or navigating self-driving cars, could one day require less computing power and hardware thanks to a new artificial neuron device developed by researchers at the University of California San Diego. The device can run neural network computations using 100 to 1000 times less energy and area than existing CMOS-based hardware.

Researchers report their work in a paper published recently in Nature Nanotechnology.

Neural networks are a series of connected layers of artificial neurons, where the output of one layer provides the input to the next. That input is generated by applying a mathematical operation called a non-linear activation function to each layer’s output. This is a critical part of running a neural network, but applying the function requires a lot of computing power and circuitry because it involves transferring data back and forth between two separate units – the memory and an external processor.
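The minimal sketch below shows where that activation sits in an ordinary software neural network; the layer sizes and the choice of ReLU are illustrative and have nothing to do with the new device's physics.

```python
# Minimal sketch of the role a non-linear activation plays between layers of a
# conventional (software) neural network.
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(3, 2))  # two dense layers

def relu(z):
    # The non-linear activation: without it, stacking layers collapses into a
    # single linear map and the network cannot represent non-linear functions.
    return np.maximum(z, 0.0)

x = rng.normal(size=4)      # input vector
hidden = relu(x @ W1)       # layer 1 output, activated, becomes layer 2 input
output = hidden @ W2        # final layer
print(hidden, output)
```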

The UK on Wednesday became the first country to announce that it will regulate the use of self-driving vehicles at slow speeds on motorways, with the first such cars possibly appearing on public roads as soon as this year.

Britain’s transport ministry said it was working on specific wording to update the country’s highway code for the safe use of self-driving vehicle systems, starting with Automated Lane Keeping Systems (ALKS) — which use sensors and software to keep cars within a lane, allowing them to accelerate and brake without driver input.

The government said the use of ALKS would be restricted to motorways, at speeds under 37 miles (60 km) per hour.
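To make the concept concrete, here is a toy sketch of the kind of control step an ALKS combines: steer toward the lane centre based on a camera-derived offset and cap speed at the regulatory limit. The gains, units, and function names are invented for the illustration and bear no relation to any certified lane-keeping system.

```python
# Toy lane-keeping step: proportional steering correction plus a speed cap at
# the proposed UK limit for ALKS use on motorways (37 mph / 60 km/h).
ALKS_SPEED_LIMIT_KPH = 60.0

def control_step(lateral_offset_m: float, heading_error_rad: float,
                 speed_kph: float) -> tuple[float, float]:
    """Return (steering_angle_rad, target_speed_kph) for one control cycle."""
    k_offset, k_heading = 0.08, 0.9          # hypothetical proportional gains
    steering = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    target_speed = min(speed_kph, ALKS_SPEED_LIMIT_KPH)
    return steering, target_speed

print(control_step(lateral_offset_m=0.4, heading_error_rad=0.02, speed_kph=72.0))
```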