Two U.S. companies have partnered to demonstrate a new type of drone intelligence gathering: multiple quadcopters, carrying different types of sensor, working together to find, track and follow a target with minimal human supervision.

What makes this impressive is that it was not a laboratory experiment with specially built hardware, but employed drones already in service with U.S. forces. The efficient, capable AI software requires minimal processing power, so this capability is just an upgrade away – and it will get more powerful as new drones and sensors are added to the picture.

“It’s an exciting milestone on the way to great things,” Matt Vogt, Chief Revenue Officer of Palladyne AI, told me.

There has been a lot of talk about swarming drone capability. Now we are starting to see how it works in practice.

Hardware Meets Software

The demonstration involved Pilot AI software from Palladyne AI running on multiple Teal drones from Red Cat carrying out what the makers term “multi-platform, multi-sensor data fusion in real time.” What this means is transferring a huge amount of the workload from the operator to the machines, a necessity when one person is operating multiple drones.

“It’s absolutely imperative that we reduce the cognitive load on a single operator using a six-inch screen with feeds from nine drones,” says Geoff Hitchcock of Red Cat. “It has to be augmented with AI.”

The demonstration involved drones collaborating to locate and follow a vehicle on the ground. They continued to track it even when it disappeared from camera view, using sensors that detected its radio signal.

This feat took development on several fronts.

One is the “multi-sensor” aspect. The drones carry different types of sensor, which might include video cameras, thermal imagers, radio-frequency detectors and radar. The software stitches these together to give a higher-fidelity picture than any single sensor can provide. “Data fusion” means combining the data so that, for example, thermal and visual readings together yield a positive identification of a specific vehicle type.
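To make the idea concrete, here is a minimal sketch of that kind of cross-sensor confirmation, assuming detections arrive as labeled, geolocated reports. The class names, distance threshold and confidence arithmetic are illustrative assumptions, not Palladyne’s actual design:

```python
from dataclasses import dataclass
import math

@dataclass
class Detection:
    sensor: str      # e.g. "visual" or "thermal"
    label: str       # e.g. "truck"
    lat: float
    lon: float
    confidence: float

def _dist_m(a, b):
    # Equirectangular approximation; adequate at short range.
    dx = math.radians(b.lon - a.lon) * math.cos(math.radians(a.lat)) * 6_371_000
    dy = math.radians(b.lat - a.lat) * 6_371_000
    return math.hypot(dx, dy)

def fuse(detections, max_dist_m=25.0):
    """Pair visual and thermal detections of the same label that fall
    close together; agreement across sensors boosts confidence."""
    fused = []
    visual = [d for d in detections if d.sensor == "visual"]
    thermal = [d for d in detections if d.sensor == "thermal"]
    for v in visual:
        for t in thermal:
            if v.label == t.label and _dist_m(v, t) <= max_dist_m:
                # Treat the sensors as independent witnesses:
                # the chance both are wrong shrinks multiplicatively.
                conf = 1 - (1 - v.confidence) * (1 - t.confidence)
                fused.append(Detection("fused", v.label,
                                       (v.lat + t.lat) / 2,
                                       (v.lon + t.lon) / 2, conf))
    return fused
```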

“Multi-platform” means the software is running on multiple drones at the same time, so data from all of them is combined. Pilot AI can interface with a drone’s autopilot to maneuver it into position and ensure that it maintains sight of the target, or to get eyes on a target spotted by another drone. Multiple video images from different angles will also make identification more reliable.
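A toy example of the multi-platform side: merging position reports of the same target from several drones into one shared estimate. The confidence-weighted average is an assumed scheme chosen for illustration, not a description of Pilot AI’s internals:

```python
def merge_track(reports):
    """Merge position reports of one target from several drones into a
    single estimate, weighting each report by its confidence.
    `reports` is a list of (lat, lon, confidence) tuples."""
    total = sum(conf for _, _, conf in reports)
    lat = sum(lat * conf for lat, _, conf in reports) / total
    lon = sum(lon * conf for _, lon, conf in reports) / total
    return lat, lon

# Three drones report the same truck from different angles
# (hypothetical coordinates):
print(merge_track([(48.5011, 35.9402, 0.6),
                   (48.5013, 35.9399, 0.8),
                   (48.5010, 35.9405, 0.7)]))
```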

The human operator then does not fly individual drones or view multiple feeds to find objects of interest. They see the fused data from all the drone sensors, and the software recognizes and highlights specific objects. For example, the system can tell the operator that a truck with a heavy machine gun is on the move and show it to them. The operator’s role is to decide how to act on that information.

“The magic of the software is that it autonomously enables improved situational awareness,” Vogt explains.

Communication By Negotiation

Another key to Pilot AI is using less bandwidth. During standard operations every drone sends streaming video back to the operator. In Iraq and Afghanistan, this resulted in tens of thousands of hours of high-resolution imagery being archived without ever being examined, as there was simply too much for human analysts to handle.

With Pilot AI, the drones pass only the data that is required, a technique refined over an extended period.

“Our CTO is truly passionate about our Pilot software,” says Vogt. “He has worked on the algorithms for many years and they’re based on reinforcement learning, sensor fusion, and game theory. If a drone needs information to help it solve a problem, it reaches out to the others and requests them to send information.”

This allows each of the drones to get the data it needs to build up a complete picture, without flooding the airwaves with redundant or meaningless data, such as video streams of empty landscapes.
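The request-and-reply exchange might look something like the following sketch, in which a drone asks peers for exactly the fields it lacks rather than streaming everything. The message format and field names are invented here for illustration:

```python
import json

def make_request(requester_id, target_id, needed):
    """A drone broadcasts a request naming only the data it lacks,
    e.g. needed=["rf_bearing", "last_position"]."""
    return json.dumps({
        "type": "data_request",
        "from": requester_id,
        "target": target_id,   # the track the request concerns
        "needed": needed,
    })

def answer_request(responder_id, request_json, local_tracks):
    """Reply with only the requested fields, if this drone has them;
    otherwise stay silent and save the bandwidth."""
    req = json.loads(request_json)
    track = local_tracks.get(req["target"])
    if track is None:
        return None
    payload = {k: track[k] for k in req["needed"] if k in track}
    return json.dumps({
        "type": "data_reply",
        "from": responder_id,
        "target": req["target"],
        "data": payload,
    })
```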

Vogt says the communication process has reinforcement learning built into it, so it gradually gets more efficient as the system learns to handle each environment and adapts to changing conditions such as weather and terrain.
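As a rough illustration of how reinforcement learning could tune such a protocol, a simple bandit can learn which requests tend to pay off relative to the bandwidth they cost. The action space and reward shaping below are purely assumptions, not a description of Palladyne’s algorithms:

```python
import random
from collections import defaultdict

class RequestPolicy:
    """Epsilon-greedy bandit over (peer, data_type) request choices:
    requests that returned useful data at low bandwidth cost get
    chosen more often over time."""
    def __init__(self, epsilon=0.1, lr=0.2):
        self.q = defaultdict(float)   # estimated value per action
        self.epsilon = epsilon
        self.lr = lr

    def choose(self, actions):
        if random.random() < self.epsilon:
            return random.choice(actions)             # explore
        return max(actions, key=lambda a: self.q[a])  # exploit

    def update(self, action, useful, bytes_sent):
        # Reward useful answers, penalize bandwidth consumed.
        reward = (1.0 if useful else 0.0) - 1e-6 * bytes_sent
        self.q[action] += self.lr * (reward - self.q[action])
```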

Minimal Hardware Required

The demonstration was carried out on Red Cat’s second-generation Teal 2 drones. What makes it really impressive is that no additional hardware was involved. Pilot AI ran on the drones’ existing hardware without interfering with other functions. This is because the software is optimized to run on minimal processing power.

“The demand on the on-board compute is very light compared to others because of the language format,” says Hitchcock. “It takes around a thousand times less computing power than comparable systems.”

Importantly, Pilot AI is platform-agnostic, meaning that it should be able to run on almost any drone. Palladyne is now working on running it on Red Cat’s latest generation, the Arachnid family, and in particular the Black Widow scout drone. However, it could run on many other platforms, which might include medium and large reconnaissance drones and even radar-equipped Reapers. The system might also incorporate other assets like the drone-dropped trail cameras which observe roads in Ukraine. By tackling the hardest challenge first – running on small drones – the developers should have made the rest of the process easy.

This would enable a common picture, in which many drones find, locate, identify and track many targets simultaneously in real time, giving commanders an unrivaled view of the battlefield.

Both Hitchcock and Vogt were U.S. military drone operators in previous lives, working with more basic systems. In those days drones were dumb, little more than radio-controlled aircraft. Hitchcock recalls launching Pointer drones, which were literally held together with giant rubber bands, off the back of a moving truck in the early 2000s. He is struck by the contrast with today’s autonomous, collaborative drones.

“It’s ridiculous how smart technology is getting now,” says Hitchcock. “There is stuff I never thought I would see in my lifetime.”

As the demo shows, the drones, and the software which enables them, already exist. The next stage is putting the pieces together and making the military aware of the current state of the art so they can decide how best to make use of it.

“We will complete integration with Black Widow imminently,” says Vogt. “After that we will be able to go out on the road and demonstrate the full initial capability to our customers.”
