
Addressing the United Nations General Assembly, President Volodymyr Zelensky delivered a blunt warning: Vladimir Putin “will continue to push the war further and deeper” unless he is stopped. His words placed Russia’s battlefield belligerence in the context of a growing technological arms race, warning that without collective resistance, Moscow’s ambitions will reach far beyond Ukraine’s western frontier. Zelensky’s call was not for traditional rearmament: he called for setting world standards on artificial intelligence in weapons, citing the unprecedented risk posed by autonomous drones and unmanned systems.

1. AI Weaponry as a Strategic Threat
Zelensky’s unease reflects the reality of a war in which drones now account for an estimated 70–80% of Russia–Ukraine casualties. AI-enabled targeting systems have raised strike accuracy from roughly 30–50% to around 80%, making combat faster and more lethal. The president of Ukraine cautioned, with good reason, that “weapons decide who survives,” pointing to the risk of placing life-or-death decisions in the hands of machines. This is no futuristic threat: autonomous drones already fill contested airspace with minimal human intervention.
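The accuracy figures above translate directly into munitions economics. A minimal back-of-envelope sketch, assuming each strike attempt succeeds independently with a fixed probability (a strong simplification of real engagements), shows why the jump from 30–50% to 80% accuracy matters:

```python
# Back-of-envelope: munitions expended per successful strike at a given
# per-attempt hit probability, assuming independent attempts (an
# illustrative simplification, not a battlefield model). Accuracy figures
# are those cited in the text: ~30-50% before AI targeting, ~80% with it.

def expected_attempts(hit_probability: float) -> float:
    """Mean munitions per successful hit: the geometric-distribution mean 1/p."""
    if not 0 < hit_probability <= 1:
        raise ValueError("hit probability must be in (0, 1]")
    return 1.0 / hit_probability

for label, p in [("manual, low end", 0.30),
                 ("manual, high end", 0.50),
                 ("AI-assisted", 0.80)]:
    print(f"{label}: ~{expected_attempts(p):.1f} drones per successful strike")
```

On these assumptions, AI targeting cuts the expected expenditure from two to three drones per hit down to roughly 1.25, which compounds quickly at the scale of thousands of daily sorties.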

2. Autonomous Drone Technical Development
Newer combat drones carry onboard AI to detect and attack targets even in heavily jammed environments. Initiatives such as Ukraine’s “Drone Line” seek to establish a 15–40 km unmanned kill zone in which aerial reconnaissance and ground operations are integrated to hinder Russian mobilization. Local innovation has driven costs down sharply: AI targeting modules can be installed for as little as $25, making precision strikes economically viable even for volunteer forces. Yet, as Twist Robotics’ Viktor Sakharchuk notes, poorly trained operators can undermine even advanced systems, which is why rigorous testing and field feedback matter.
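Sustaining a standing surveillance belt of that kind is largely a coverage-geometry problem. The sketch below sizes the number of concurrent reconnaissance orbits needed along a stretch of front; the front length, sensor radius, and overlap fraction are illustrative assumptions, not published “Drone Line” parameters:

```python
import math

# Rough sizing sketch for a persistent unmanned surveillance belt like the
# "Drone Line" concept. All numeric parameters below are assumptions for
# illustration, not figures from the initiative itself.

def orbits_needed(front_km: float, sensor_radius_km: float,
                  overlap: float = 0.2) -> int:
    """Concurrent drone orbits required to cover a front of the given length,
    spacing orbits so adjacent sensor footprints overlap by `overlap`."""
    effective_spacing = 2 * sensor_radius_km * (1 - overlap)
    return math.ceil(front_km / effective_spacing)

# Assumed: a 100 km sector, 5 km effective sensor radius per orbit,
# 20% footprint overlap for handoff between adjacent orbits.
print(orbits_needed(100, 5))
```

Each concurrent orbit in turn implies several airframes in rotation (transit, charging, maintenance), which is where the $25-module economics cited above become decisive.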

3. Nuclear Delivery and Deterrent Implications
Beyond conventional strikes, AI-drone capability extends into nuclear strategy. Experts caution that such systems could enhance nuclear delivery through penetration of missile defenses, radar evasion, and precision engagement of strategic sites. Persistent overflight and rapid-reaction capabilities strengthen second-strike credibility, but they may also lower the threshold for nuclear action. Integrating autonomous unmanned aerial vehicles with nuclear command-and-control systems could blur the distinction between conventional and nuclear war, a situation riddled with escalation risks.

4. Moldova Under Hybrid-Warfare Pressure
Zelensky warned that Europe “cannot afford to lose Moldova” to Russian influence. Pro-EU President Maia Sandu has accused the Kremlin of pouring “hundreds of millions of euros” into destabilization ahead of local elections. Disinformation campaigns, amplified by Moscow-connected networks, mirror the hybrid approach employed in Ukraine: cyber intrusion, propaganda, and clandestine funding combined with military pressure. The technological thread is unmistakable: autonomous weapons and AI-fed propaganda machines come from the same playbook.

5. Violations of NATO Airspace and Rules of Engagement
Recent Russian incursions into NATO airspace, including MiG-31s over Estonia and drones over Poland and Romania, have provoked demands for stronger responses. NATO’s rules of engagement are classified, but they rest on allied consensus and prudent threat assessment. Some nations, such as Poland and the Czech Republic, argue that intruding aircraft should be shot down. The risk of escalation is compounded by uncertainty about U.S. policy under President Trump, despite his recent statement that Ukraine could “WIN all of Ukraine back in its original form.”

6. Counter-Drone Measures for Strategic Assets
The widespread adoption of unmanned systems has made defending high-value infrastructure, such as nuclear facilities, a necessity. The U.S. National Nuclear Security Administration has deployed counter-UAS tools such as Anduril’s Anvil, an interceptor that can physically destroy adversary drones. The system is paired with command-and-control software that detects, tracks, and neutralizes threats in real time. Such countermeasures matter because adversaries increasingly aim drone attacks at strategic assets.
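The detect-track-neutralize loop that such command-and-control software runs can be sketched as a simple state machine. This is an illustrative toy under invented names and a hypothetical keep-out radius, not Anduril’s actual software:

```python
# Toy sketch of a counter-UAS engagement loop (detect -> track -> neutralize).
# All class names, thresholds, and logic here are invented for illustration.

from dataclasses import dataclass
from enum import Enum, auto

class Phase(Enum):
    DETECTED = auto()
    TRACKING = auto()
    NEUTRALIZED = auto()

@dataclass
class Track:
    track_id: int
    position: tuple  # (x_km, y_km) relative to the defended site
    phase: Phase = Phase.DETECTED

def engage(track: Track, keep_out_radius_km: float = 2.0) -> Track:
    """Advance a track through the pipeline: targets inside the keep-out
    radius are cued to an interceptor; everything else stays under watch."""
    distance = (track.position[0] ** 2 + track.position[1] ** 2) ** 0.5
    if distance <= keep_out_radius_km:
        track.phase = Phase.NEUTRALIZED  # hand off to kinetic interceptor
    else:
        track.phase = Phase.TRACKING     # keep sensors on the target
    return track

print(engage(Track(1, (0.5, 1.0))).phase)  # inside the keep-out radius
print(engage(Track(2, (5.0, 5.0))).phase)  # outside: monitor only
```

The real systems differ in every detail, but the architectural point stands: the neutralize decision is one rule at the end of a long sensing-and-tracking pipeline, which is exactly where debates about human authorization concentrate.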

7. International Regulation of Lethal Autonomous Weapons
Momentum is building for a treaty on Lethal Autonomous Weapons Systems (LAWS). In December 2024, a UN resolution banning fully autonomous targeting of humans and constraining operational scope passed by 166 votes. Existing typologies distinguish between semi-autonomous, supervised autonomous, and fully autonomous systems, with the latter two generally considered LAWS. Classic examples include Israel’s HARPY, Russia’s Lancet-3, and Türkiye’s KARGU, all capable of selecting and attacking targets without human input.
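The typology above turns on two questions: who selects the target, and can a human intervene? A minimal sketch of that decision logic, as one simplified reading of the typology rather than treaty language:

```python
# The three-tier autonomy typology from the text, expressed as a small
# classifier. The decision rules are a simplified reading, not legal text.

from enum import Enum

class Autonomy(Enum):
    SEMI_AUTONOMOUS = "human selects each target"
    SUPERVISED_AUTONOMOUS = "system selects targets; human can veto"
    FULLY_AUTONOMOUS = "system selects and engages without human input"

def classify(human_selects_targets: bool, human_can_veto: bool) -> Autonomy:
    if human_selects_targets:
        return Autonomy.SEMI_AUTONOMOUS
    if human_can_veto:
        return Autonomy.SUPERVISED_AUTONOMOUS
    return Autonomy.FULLY_AUTONOMOUS

# e.g. a loitering munition that picks and attacks emitters entirely on its own:
print(classify(human_selects_targets=False, human_can_veto=False))
```

Framed this way, the regulatory line the 2024 resolution draws falls between the second and third tiers: once neither selection nor veto involves a human, the system crosses into the territory the resolution seeks to ban.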

8. Ethical and Research Implications
Researchers such as Harvard’s Kanaka Rajan caution that AI weapons risk devaluing the human element in war and making it politically easier to wage one. Militarization of AI could also co-opt civilian research, and export controls akin to those imposed on nuclear physics during the Cold War may follow. Rajan argues for university-level oversight of defense-sponsored projects to ensure that ethical limits are preserved. The challenge is balancing technological progress with protection against abuse.

9. The Arms Race in Algorithms
Ukraine and Russia are waging not only a war of arms but a war of code. Whichever side achieves a better “hive mind” of networked autonomous systems acting in coordination may dominate future battlefields. Yet, as Dan Skinner, a retired Royal Australian Infantry officer, explains, low-tech countermeasures can still defeat high-tech superiority. Operation Spiderweb, Ukraine’s June 2025 drone attack on Russian bomber squadrons, displayed both the potential and the unpredictability of AI-facilitated warfare.

The intersection of geopolitical ambition, autonomous weapons, and AI-driven decision-making is rewriting the rules of war. Zelensky’s warning is as much about holding the ground as about holding to the conviction that humans, not software, should determine the fate of nations.