A U.S. Army engineer’s idea to turn the standard M4 rifle into an electromagnetic pulse gun recently got the nod from the U.S. Patent and Trademark Office.
James E. Burke, electronics engineer at the U.S. Army’s Armament Research, Development and Engineering Center, received U.S. patent 10,180,309 on Tuesday, giving the Army intellectual property protections on Burke’s “Electromagnetic Pulse Transmitter Muzzle Adapter.”
This invention would enable a single soldier in a ground unit to destroy enemy electronics, such as small drones or improvised explosive devices, by attaching a special blank-firing adapter to their rifle’s muzzle, then firing a shot.
The development of utility fog just took a significant step forward. The projected size after further miniaturization is on the millimeter scale, and improved nanofabrication should push that to sub-millimeter.
Absolutely no moving parts, either.
A drone powered by electrohydrodynamic thrust is the smallest flying robot ever made.
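For a rough sense of scale: ideal electrohydrodynamic thrust follows the standard one-dimensional corona-discharge relation T = I·d/μ, where I is the corona current, d is the electrode gap and μ is the ion mobility of air. The sketch below just evaluates that relation; the current, gap and mobility figures are illustrative assumptions, not measurements from the actual device.

# Back-of-the-envelope electrohydrodynamic (EHD) thrust estimate.
# Uses the ideal one-dimensional corona model T = I * d / mu, where
# I is corona current, d is the electrode gap, and mu is ion mobility.
# All numeric values below are illustrative assumptions.

MU_AIR = 2.0e-4   # ion mobility of air, m^2/(V*s), a typical literature value

def ehd_thrust(current_a: float, gap_m: float, mobility: float = MU_AIR) -> float:
    """Ideal EHD thrust in newtons for a given corona current and electrode gap."""
    return current_a * gap_m / mobility

# Hypothetical example: a milligram-scale ionocraft drawing 0.3 mA
# across a 0.5 mm electrode gap.
thrust_n = ehd_thrust(current_a=0.3e-3, gap_m=0.5e-3)
print(f"Thrust: {thrust_n * 1e6:.0f} uN")  # ~750 uN

At these assumed values the thrust works out to roughly 750 µN, enough to hover about 75 mg, which is why milligram-scale flyers with no moving parts are plausible at all.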
The U.S. Army has placed a $39 million order for tiny reconnaissance drones, small enough to fit in a soldier’s pocket or palm.
The idea behind the drones, which are made by FLIR Systems and look like tiny menacing helicopters, is that soldiers will be able to send them into the sky over the battlefield to gain a “lethal edge” in combat, according to Business Insider.
A saying from one of my favorite movies is, “Tie two birds together and even though they have four wings they cannot fly.” The same can’t be said of drones: tie several together and, as the work below shows, they can still fly.
We perform an outdoor autonomous flight experiment with f-LASDRA (flying Large-size Aerial Skeleton with Distributed Rotor Actuation), which is constructed from multiple ODAR-8 links (https://youtu.be/S3i9NspWtr0) connected to each other by flexible cables. Each ODAR-8 link can generate omni-directional force/torque and compensate for its own weight, rendering f-LASDRA scalable with respect to the number of links.
Utilizing an SCKF with a standard IMU/GNSS module on each link together with inter-link kinematic constraints, we can significantly improve the position/attitude estimation accuracy of f-LASDRA to the level necessary for stable control (less than 5 cm), compared with the typical 1–5 m accuracy of GNSS alone. A semi-distributed version of the estimation framework is also devised to address scalability. (Accepted to ICRA 2019.) See also https://youtu.be/oHkjB8XzxIg for operational LASDRA.
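The SCKF itself isn’t spelled out in the abstract, but the core idea of fusing meter-level GNSS with tight inter-link kinematic constraints can be illustrated with a plain linear Kalman update. The following two-link sketch is not the authors’ estimator; all noise figures are illustrative assumptions.

import numpy as np

# Minimal sketch of constraint-aided position estimation for a two-link
# aerial chain, loosely in the spirit of the f-LASDRA estimator above.
# State: stacked 2-D positions of link 1 and link 2 -> x = [x1, y1, x2, y2].
P = np.eye(4) * 25.0   # prior covariance (5 m std per axis)
x = np.zeros(4)        # prior mean

def kf_update(x, P, H, z, R):
    """Standard linear Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# 1) GNSS fix on each link, ~2 m std per axis (typical single-point GNSS).
H_gnss = np.eye(4)
R_gnss = np.eye(4) * 2.0**2
z_gnss = np.array([0.8, -1.2, 2.9, -0.7])   # simulated noisy fixes
x, P = kf_update(x, P, H_gnss, z_gnss, R_gnss)

# 2) Inter-link kinematic constraint: the cable/joint geometry pins the
#    relative position p2 - p1 to a known offset, here (2, 0) m, with
#    centimeter-level uncertainty.
H_kin = np.array([[-1.0, 0.0, 1.0, 0.0],
                  [0.0, -1.0, 0.0, 1.0]])
R_kin = np.eye(2) * 0.02**2
z_kin = np.array([2.0, 0.0])
x, P = kf_update(x, P, H_kin, z_kin, R_kin)

print("per-link absolute std (m):", np.sqrt(np.diag(P)))
print("inter-link relative std (m):", np.sqrt(np.diag(H_kin @ P @ H_kin.T)))

In this toy the absolute per-axis error only drops by about a factor of √2 (the constraint lets the two GNSS fixes average), while the relative link geometry becomes centimeter-accurate; the reported sub-5 cm accuracy presumably relies on many links plus IMU propagation, which this sketch omits.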
The world has not entered the age of the killer robot, at least not yet. Today’s autonomous weapons are mostly static systems to shoot down incoming threats in self-defence, or missiles fired into narrowly defined areas. Almost all still have humans “in the loop” (eg, remotely pulling the trigger for a drone strike) or “on the loop” (ie, able to oversee and countermand an action). But tomorrow’s weapons will be able to travel farther from their human operators, move from one place to another and attack a wider range of targets with humans “out of the loop”. Will they make war even more horrible? Will they threaten civilisation itself? It is time for states to think harder about how to control them.
A good approach is a Franco-German proposal that countries should share more information on how they assess new weapons; allow others to observe demonstrations of new systems; and agree on a code of conduct for their development and use. This will not end the horrors of war, or even halt autonomous weapons. But it is a realistic and sensible way forward. As weapons get cleverer, humans must keep up.
NAIROBI — Countries must agree strict rules on “killer robots” — autonomous weapons which can assassinate without human involvement, a top Red Cross official has said, amid growing ethical concerns over their use in future wars.
Semi-autonomous weapons systems, from drones to tanks, have been used for decades to eliminate targets in modern-day warfare, but they have all had human control behind them.
With rapid advances in artificial intelligence, there are fears among humanitarian groups that it will be used to develop machines that can independently decide whom to kill.