‘The Lesson from Gaza Is Clear: When AI-Powered Machines Control Who Lives, Human Rights Die’

By CIVICUS
Jul 21 2025 –  
CIVICUS discusses the military use of artificial intelligence (AI) in Gaza with Dima Samaro, a Palestinian lawyer and researcher, and director of Skyline International for Human Rights, a civil society organisation (CSO) that defends digital freedoms and human rights in the Middle East and North Africa. Dima serves on multiple boards focused on civic space and surveillance issues, including Innovation for Change’s MENA Hub, the Surveillance in the Majority World Network and the VUKA! Solidarity Coalition, and volunteers with Resilience Pathways to help Palestinian CSOs counter Israeli efforts to restrict civic space and manipulate public narratives.

Dima Samaro

Gaza has become a testing ground for AI-powered warfare. Israel deploys systems such as Gospel and Lavender that produce thousands of strike recommendations based on alleged links to Hamas. Meanwhile, facial recognition technology controls aid distribution and tracks displaced civilians. These tools operate without legal oversight or transparency, creating dangerous accountability gaps. As private companies develop and profit from this technology, Gaza exposes the grave dangers of unregulated AI warfare and its potential for normalising automated violence.

What AI tools are being deployed in Gaza?

Israel is using experimental AI systems on an unprecedented scale in Gaza, making real-time life-or-death decisions against a besieged civilian population. The technology strips away humanity from warfare. In Nuseirat refugee camp, residents reported hearing the cries of infants and women before Israeli quadcopters opened fire directly on those who responded.

The surveillance apparatus is equally invasive. During forced evacuations from northern to southern Gaza, civilians undergo invasive facial recognition and biometric scans to pass military checkpoints. AI-equipped ‘smart cameras’ monitor hospitals such as Al-Shifa in real time during raids. Constant biometric scanning leaves people feeling hunted, reducing them to targets and inflicting deep psychological trauma.

The impacts extend beyond surveillance. In Jabalia refugee camp, explosive robots systematically destroy homes and kill civilians, blocking rescue efforts and burying survivors under rubble. United Nations (UN) experts describe these attacks as ‘domicide’ – the deliberate destruction of civilian homes.

Technology no longer merely enables violence; it helps automate genocide. Israel has integrated AI into its military kill chain, using systems such as The Gospel, Lavender and Where’s Daddy to generate kill lists, geolocate targets and assign strikes. Lavender alone reportedly marked over 37,000 Palestinians for assassination based on flawed metadata and biased algorithms. These systems eliminate human oversight, leading to mass civilian casualties under a secretive, unaccountable regime.

Most information about these technologies comes from Israeli whistleblowers and western investigative journalists. In Gaza, over 230 journalists have been killed since October 2023, many deliberately targeted in drone strikes. This has allowed experimental warfare to continue largely hidden from global scrutiny.

How do corporations profit from this technology?

A vast network of companies profits from Gaza’s suffering. Elbit Systems, Israel’s largest arms manufacturer, supplies 85 per cent of Israel’s military drones and related equipment, marketing them as ‘field-tested’ in Gaza. European firms enable the violence: Italy’s Leonardo S.p.A. supplies naval guns with electronic targeting systems, while Greece’s Intracom Defense continues receiving European Union (EU) defence funding despite developing components for Israeli weapons systems.

US tech giants provide the digital infrastructure. Amazon, Google and Microsoft deliver cloud services allegedly used to confirm assassination strikes that have killed civilians. Amazon reportedly hosts intelligence on nearly every person in Gaza. Palantir expanded its contract with Israel in early 2024 to provide battlefield systems that identify and target Palestinians.

Most cynically, surveillance also masquerades as humanitarian aid. Firms such as UG Solutions, staffed by former US military personnel, use drones to scan Palestinians at aid distribution sites. This data feeds directly into targeting systems, transforming the search for food into potential death sentences. As of 13 July, the UN reported 875 Palestinians had been killed while trying to access food, 674 of them near sites run by private contractors such as the Gaza Humanitarian Foundation, part of this militarised aid network.

This creates a profit model where Palestinians become variables in a dataset and civilian suffering becomes marketable. Behind the rhetoric of self-defence, corporations turn genocide into lucrative business.

What legal protections exist against military AI?

Virtually none. The UN Educational, Scientific and Cultural Organization’s 2021 AI ethics guidelines and the UN Guiding Principles on Business and Human Rights are voluntary and lack enforcement. The 2024 EU AI Act exempts military AI systems, including autonomous drones used in warfare, from regulation, which is particularly troubling given the EU’s dual role as both ethical AI standard-setter and major arms supplier to Israel.

Export controls also fail. The Wassenaar Arrangement – an agreement to control the export of arms and of goods and technologies with military uses – cannot regulate Israel since it’s not a member, allowing its AI weapons to evade scrutiny and be exported widely.

This legal vacuum enables powerful states to evade international law, invoking national security to justify AI violence far beyond battlefields. In Gaza, this manifests through forced biometric scans during displacement that serve solely as control tools. Survival depends on surrendering to constant surveillance.

The hypocrisy is stark: Israel recently signed the Council of Europe’s AI and Human Rights Convention while simultaneously using AI for mass surveillance and killing. This highlights how ethical frameworks shaped in the global north fail to address conflict zone realities.

What’s needed for effective accountability?

Current accountability mechanisms are structurally broken. Israeli military leaders blame algorithms despite known error rates, while corporations hide behind trade secrecy. In Gaza, this may constitute war crimes, yet legal tools such as universal jurisdiction are rarely applied.

Soft approaches fail completely. Corporate self-regulation and voluntary oversight assume transparency that doesn’t exist in Gaza. Real accountability requires direct pressure: arms export bans, targeted sanctions, strategic litigation and removing military exemptions from AI laws. We need International Criminal Court investigations targeting Israeli officials and corporate leaders enabling these actions.

Why does this matter globally?

Gaza serves as a warning. AI warfare tested on Palestinians gets exported worldwide. Israeli drones previously used in Gaza are now deployed by Frontex, the EU’s border control agency, to patrol the Mediterranean and intercept, not rescue, migrant boats before they reach European shores. Israeli arms exports hit a record US$ 14.79 billion in 2024 – over half sold to Europe. Weapons used in Gaza today could be used tomorrow in Colombia, Myanmar or Sudan.

As militarised AI becomes normalised, the language of ‘precision’ and ‘efficiency’ masks atrocity. The lesson from Gaza is clear: when AI-powered machines control who lives, human rights die. This transcends Palestine’s tragedy – it foreshadows everyone’s future.

Yet resistance persists despite repression. Journalists and civil society activists continue to document AI warfare and prepare legal actions under constant danger and internet blackouts. We refuse invisibility. While governments debate toothless AI ethics, grassroots organisations, university students and tech workers challenge corporations enabling violence. The No Tech for Apartheid campaign targets companies supporting Israeli surveillance, such as Google.

Gaza reminds us that the fight against automated warfare happens not in UN halls but on the ground, and that it’s both a stand against the algorithmic erasure of Palestinian lives and a broader defence of human rights everywhere.


SEE ALSO
Israel vs Iran: new war begins while Gaza suffering continues CIVICUS Lens 19.Jun.2025
‘Digital platforms amplify the Israeli narrative while systematically silencing Palestinian voices’ CIVICUS Lens | Interview with Dima Samaro 27.Dec.2024
‘AI-powered weapons depersonalise the violence, making it easier for the military to approve more destruction’ CIVICUS Lens | Interview with Sophia Goodfriend 23.Nov.2024

 


