“This technology is our future threat,” warns Serhiy Beskrestnov as he examines a newly captured Russian drone. Unlike traditional weapons, it uses artificial intelligence to locate and strike targets without human control.
Beskrestnov, a consultant for Ukraine’s defence forces, has analysed countless drones since the war began. This model stands out. It neither sends nor receives signals, making it impossible to jam or detect.
Both Ukrainian and Russian forces now deploy AI on the battlefield. They use it to locate enemy positions, analyse intelligence, and clear mines faster than ever before.
AI becomes Ukraine’s invisible ally
Artificial intelligence has become indispensable for Ukraine’s army. “Our forces receive over 50,000 video streams from the front every month,” says Deputy Defence Minister Yuriy Myronenko. “AI analyses the footage, identifies threats, and plots them on maps for commanders.”
The technology enables faster decision-making, optimises resources, and reduces casualties. Its most profound effect is on unmanned systems. Ukrainian troops now operate drones that lock onto targets and fly autonomously during the final stage of an attack.
These drones cannot be jammed and are extremely difficult to shoot down. Experts predict they will evolve into fully autonomous weapons able to identify and eliminate targets on their own.
Drones that operate independently
“All a soldier needs to do is press a button on a smartphone,” explains Yaroslav Azhnyuk, CEO of Ukrainian tech company The Fourth Law. “The drone finds its target, drops explosives, assesses the damage, and returns to base. No piloting skills are required.”
Azhnyuk believes these drones could greatly strengthen Ukraine’s air defences against Russian long-range drones like the Shaheds. “A computer-guided system can outperform humans,” he says. “It reacts faster, sees more clearly, and moves more precisely.”
Myronenko admits fully autonomous systems are still in development but says Ukraine is close. “We have partly integrated it into some devices,” he adds. Azhnyuk predicts thousands of these drones could be operational by the end of 2026.
Progress comes with serious risks
Full automation carries dangers. “AI might not distinguish a Ukrainian soldier from a Russian one,” warns Vadym, a defence engineer who requested anonymity. “Their uniforms often look identical.”
Vadym’s company, DevDroid, produces remote-controlled machine guns that use AI to detect and track people. Automatic firing is disabled to prevent friendly fire. “We could enable it,” he says, “but we need more field experience and feedback before trusting it fully.”
Ethical and legal questions remain. Can AI follow the laws of war? Will it recognise civilians, or soldiers who surrender? Myronenko stresses that humans must make the final decision, even when AI assists. Yet he warns that not all militaries will act responsibly.
A new global arms race
AI is driving a dangerous new arms race. Traditional defences—jamming, missiles, or tanks—struggle against swarms of intelligent drones.
Ukraine’s “Spider Web” operation last June, when 100 drones struck Russian air bases, reportedly relied on AI coordination. Many fear Moscow could replicate the tactic, both at the front and deep inside Ukraine.
President Volodymyr Zelensky told the United Nations that AI is fuelling “the most destructive arms race in human history.” He called for urgent global rules on AI weapons, warning the threat is “as urgent as preventing the spread of nuclear arms.”
