OP-ED: When the algorithm strikes first
Asymmetric warfare redefines global military strategies
In December 2023, the Nigerian military launched an airstrike on Tudun Biri village in Kaduna State. Officials called it a targeted operation against insurgents. The reality, reconstructed from satellite imagery and survivor testimonies: 85 civilians dead, primarily women and children gathered for a religious festival.
No warning was issued. No explanation followed. No one was held accountable.
Tudun Biri was not an aberration. It was a system working exactly as it has been allowed to operate: unguided, unsupervised, and ungoverned.
War, upgraded
Nigeria has been in armed conflict for over fifteen years. First Boko Haram, and later the Islamic State West Africa Province (ISWAP), have displaced millions and exhausted one of the continent's most formidable militaries. In response, the Armed Forces have turned, as militaries everywhere are turning, to technology.
Unmanned aerial vehicles (UAVs), AI-enabled surveillance, and automated targeting tools are no longer speculative. They are actively deployed in Nigerian skies, over Nigerian villages. In a theatre as geographically vast as this, the case for these tools is real: faster logistics, sharper intelligence, reduced risk to soldiers.
The technology works. What surrounds it does not.
Academic inquiry
My ongoing research, conducted with colleagues across three African conflict states, examines the governance architecture that is supposed to regulate how these systems are acquired, tested, deployed, and reviewed. What we found in Nigeria was, in most meaningful respects, the absence of a governance framework entirely.
Nigeria’s Public Procurement Act of 2007 contains exemptions for “special goods,” a provision that has been systematically applied to high-value defence acquisitions, including the drones and autonomous systems now deployed in conflict zones. Defence contracts, technical specifications, and rules of engagement remain classified. Parliamentary committees struggle to scrutinise them. Civil society cannot audit them. Independent investigators must piece together what happened from satellite imagery and survivor accounts after the fact.
This is a deliberate structural choice — one that transfers decision-making authority over life-and-death systems from accountable institutions to opaque procurement processes.
When AI targeting systems ingest flawed intelligence or corrupted data, there is no human-in-the-loop mechanism to prevent tragedy. In private, military planners concede that current drone operations lack the rigorous target verification protocols standard in NATO-aligned forces.
I define this condition as an “accountability desert” — a state where high-consequence AI systems operate without the legal, ethical, or institutional infrastructure necessary for accountability.
State silence
The state’s response to Tudun Biri was telling. It rebuilt homes. It promised a hospital. It established skills centres. But it did not deliver accountability. No individual commander was held responsible for what officials quietly termed “command confusion.” No independent audit of the targeting failure was conducted.
This absence has fostered a dangerous resignation among affected communities. Technological errors are increasingly accepted as inevitable: acts of God rather than failures of governance. The state has been allowed to substitute charity for justice.
As Hamza Suleiman, an award-winning Nigerian conflict reporter who has covered the insurgency for over a decade, observes: effective drone operations require approximately USD 1 million in specialised training per air officer, for flight skills, intelligence analysis, mapping, geolocation, and target identification.
The Nigerian military’s current drone units lack anything approaching this level of investment. The result is a dangerous gap between operational hardware and human expertise, one that insurgents, often trained by foreign fighters, are quick to exploit.
Some of the autonomous weapons found on African soil — including Iranian Mohajer-6 combat UAVs — are sourced directly from Iran’s supply chain. The same class of systems currently threatening the Gulf’s most sophisticated defence architectures is already present on the continent. In some cases, non-state actors now field more capable systems than the governments attempting to contain them.
Reform
Closing the accountability desert requires more than localised reform. National policy must be brought into line with continental governance frameworks.
Nigeria signed the 2024 Responsible AI in the Military Domain (REAIM) Blueprint for Action, alongside 59 other nations, committing to meaningful human control and transparent procurement. South Africa participates in the Global Commission on Responsible AI in the Military Domain (GC REAIM), a Netherlands-led initiative promoting the responsible and lawful deployment of such systems. The commitments exist on paper. But the distance between those commitments and what is happening on the ground is not a gap at all; it is a void.
Closing it requires specific, concrete action: direct access for elected officials to review classified AI defence programmes; independent third-party auditing of AI decision-making processes before and after deployment; and the codification of AI-specific rules of engagement within military doctrine that mandate human override, ensuring that personnel retain the authority to intervene when automated systems fail.
South Africa has the diplomatic standing within the African Union to champion continent-wide governance standards and prevent Africa’s co-option as a testing ground for unregulated foreign military AI. The institutional capacity for these reforms exists. What is missing is the political will to apply them to the most powerful and consequential domain of all.
The families of Tudun Biri accepted the new houses. But they, and millions of other Nigerians living under the watch of systems that cannot distinguish a religious festival from an insurgent gathering, deserve more than reconstruction.
They deserve the assurance that when a drone is deployed, it is guided by a system constrained by law, overseen by independent auditors, and operated by professionals who know that a mistake will cost more than a budget line for rebuilding.
They deserve a security architecture where the law, and indeed basic human decency, has the final word.
A shorter version of this argument was published in Business Day (South Africa) on 11 March 2026, examining what the Iran–Gulf drone strikes reveal about autonomous weapons governance on the African continent. This is the longer treatment.