Algorithmic Warfare: The Rise of Autonomous Weapons

Published
December 19, 2025
Category
Emerging Technologies
Word Count
330 words
Voice
liam

Full Transcript

Machines with no conscience are making split-second decisions about who lives and who dies. In Gaza, algorithms have generated kill lists of up to 37,000 targets. Autonomous weapons are being deployed in Ukraine and were showcased at a recent military parade in China.

States are racing to integrate these technologies into their arsenals, believing they can maintain control. Unlike remotely piloted drones, autonomous weapons make lethal decisions independently, without a human operator in the loop.

Once activated, they process sensor inputs, such as facial-recognition matches and heat signatures, to identify targets and fire without further human input. This capability allows conflicts to escalate rapidly and introduces the potential for lethal mistakes.

The Israeli military's operations in Gaza have relied on AI-assisted targeting systems, including Lavender and The Gospel, to identify suspected militants and generate bombing targets. Israeli intelligence has acknowledged a 10 percent error rate, which it deemed acceptable, along with a tolerance of 15 to 20 civilian deaths for every junior militant targeted.

This depersonalization of violence creates an accountability void, raising the question of who is responsible when an algorithm kills the wrong person. The international community has struggled to regulate autonomous weapons, with discussions underway since 2013 failing to yield binding rules.

Major military powers, including India, Israel, Russia, and the United States, have systematically blocked regulation proposals. A breakthrough came in September 2023, however, when 42 states expressed readiness to move forward with negotiations.

In December 2023, the UN General Assembly adopted Resolution 78/241, with 152 states voting in favor. A year later, Resolution 79/62 mandated discussions on ethical dilemmas and technological risks associated with autonomous weapons.

The Campaign to Stop Killer Robots, a coalition of over 270 civil society groups, has been advocating for the regulation of these systems, proposing a two-tiered approach to control the most dangerous technologies.

The international community has one year to negotiate a treaty to prohibit autonomous weapons that directly target humans or operate without meaningful human control. With autonomous weapons becoming more sophisticated and embedded in military doctrine, regulating them is increasingly urgent.
