In 2023, Banksy, the most famous unknown artist in the world, unveiled a new mural in South London (Banksy 2023). Painted onto a traffic stop sign, the piece depicts three military unmanned aerial vehicles, or drones (presumably Predator class). As in most of Banksy’s art, the message is clear: stop the war. What is striking about the artwork is not that it was a Banksy, or even that it was stolen mere hours after it had been put up. Instead, it calls our attention to an unusual subject that has become normalized (Pong 2022) but is rarely discussed in public: military and surveillance drones.
The issue of drone warfare has preoccupied academics and policymakers of various kinds for over two decades now (Enemark 2021). Drones are hailed for their ability to execute targeted ‘precision’ or ‘surgical’ strikes, thus ostensibly minimizing unnecessary deaths in conflict. Others argue that this raises ethical issues: precision attacks are rarely, if ever, precise (Benjamin 2013; Pitch Interactive 2022). Furthermore, many question whether it is legal or ethical to kill using drones at all (including important campaigns and NGOs such as Stop Killer Robots and Reprieve), largely because they are used outside of sovereign territories to carry out extrajudicial killings. Put simply, extrajudicial killings are those in which no due process was involved in sentencing the target. Yet we see increasing numbers of drones on our streets, in the air around us, and in military deployments. What does all that mean for the ordinary citizen?
Drones are interesting in that they allow states and other actors to project their power more flexibly. This suggests that drones act as prostheses through which various regimes influence state and non-state actors inside and outside of their territories. In democratic societies, we increasingly witness calls to deploy drones to compensate for the lack of manpower in police departments.
Like many other countries, including the UK, Türkiye, China, and the US, Israel has recognized the incredible capacity of drones to confer tactical and even strategic advantage. Israel has used drones in Gaza for over a decade, both because they are ideal for operating in narrow urban spaces and because they can loiter in the air for many hours, thus collecting valuable intelligence. What interests me is the way drones are transferred from the seemingly distant theatre(s) of war into civil societies.
The current war in Gaza, the subject of an ICJ case in which the court found that Israel’s acts ‘could amount to genocide’ (OHCHR 2024), is a good case in point. While the decision on genocide is pending, the terrible crimes committed by the Israel Defense Forces (IDF) are often executed through the proxy of drones. In this article, I propose that whatever happens with drone use in Gaza (or anywhere else in the world, such as Yemen) has the potential to be operationalized for civilian purposes in the West.
‘Flesh Witnessing’: Drones in Gaza
When we think of war and policing, we usually imagine a personal encounter. In war, this encounter is often fetishized as a heroic confrontation between two fighters. It has been a source of inspiration for artists, orators, politicians, and thinkers. Consider, for example, Remarque’s book All Quiet on the Western Front or Spielberg’s film Saving Private Ryan. These stories rest on something that Israeli scholar Yuval Noah Harari has called ‘flesh-witnessing,’ or the first-hand physical experience of trauma (Harari 2009).
Flesh-witnessing brings to the fore the personal and, as Enemark has called it, ‘heroic’ nature of warfare because it involves a direct and embodied struggle (Enemark 2014). Drones desensitize us to this heroic aspect of warfare. More importantly, drones desensitize us to flesh-witnessing in general. Flesh-witnessing in the drone age is not witnessing combat between two or more people, but witnessing someone being targeted, maimed, and/or killed by an unmanned vehicle. How do drones do that?
Militaries use predictive algorithmic technologies, such as Israel’s Lavender and Gospel programs, to help them establish killable targets. A drone might surveil an area, intercept signals from local telecom networks, and record heat signatures; the data it collects, together with material from Israel’s intelligence agencies, is fed into algorithmic systems that develop actionable predictions on targets. The information gathered is measured in petabytes: videos, images, communications, heat and sound signatures, and coordinates. These programs determine who gets to be killed.
The intensity of drone surveillance takes its toll on civilians living in targeted areas. Many people in Afghanistan reported suffering from depression due to daily drone overflights during the US war in Afghanistan (2001–2021; see Edney-Browne 2019 and Fisk et al. 2019). In Afghanistan, as in Gaza, Pakistan, and Yemen, people have reported growing so accustomed to the buzzing of drones that they suffer from ‘anticipatory anxiety,’ suggesting an experienced lack of control ‘over one’s safety’ (Emery and Brunstetter 2015).
Drones can stay in the air for up to 24 hours, loitering above targeted areas. Larger drones (such as the Reaper class) can sometimes be heard, their soft humming likened to the sound of ‘mosquitos’ or ‘lawnmowers.’ Smaller drones flying closer to the ground (such as the Skylark) can be heard even more clearly. The constant presence of drones greatly affects civilians’ psycho-social experience, particularly that of children. Emery and Brunstetter (2015) call this phenomenon ‘aerial occupation,’ while others call it ‘sonic warfare.’ From Gaza, Shahd Safi writes that ‘always, always, there is the buzzing,’ emphasizing that ‘drones are not new to Gazans’ (Safi 2024). Since October 7th, the swarming of the machines has become so intense as to cause ‘headaches, irritability, and insomnia, and can literally drive one mad’ (Safi 2024).
Algorithms of Death
Drones have a dual role in Gaza. In the first instance, they map data to be processed by algorithmic systems such as Gospel in order to establish targets, as I discuss below. In the second instance, they replace soldiers on the ground (Wilcox 2015 and 2017). Euro-Med Human Rights Monitor recently reported that quadcopters (very light, fast, and agile rotary drones) with rifles attached to them are used to murder civilians on the streets of Gaza (Euro-Med Monitor 2024). I recount a particularly morbid case of drone murder they reported further below.
In the ongoing war in the Gaza Strip, Israel has been actively using its Gospel and Lavender systems to rank targets based on the potential civilian death toll they might cause (Pruul 2024). Gospel is so prolific that it can generate up to 100 targets per day (Davies et al. 2023). This ‘speed’ often justifies the use of such technologies (Gedeon and Miller 2024). As early as October 2023, the IDF said that it had identified 12,000 people as potential targets. Various researchers point out one important thing, however: there is absolutely no way of validating that the data processing and classification done by these automated systems are correct. To classify someone as a threat solely on the basis of data means trusting an unsupervised system to determine that person’s killability. To expedite this, it seems that Israel has reduced the ‘kill chain’ (the process between identifying a target and authorizing a strike) to just about 10 minutes (El-Shewy, Griffiths, and Jones 2024). How could such intelligence data possibly be verified in 10 minutes?
Israel’s Gospel and Lavender programs are by no means exceptions. It is important to remember that the US military did much the same with its Gilgamesh and Skynet programs in Pakistan and Afghanistan during the Obama years (2009–2017; see Pugliese 2020). It is fascinating how much military establishments and governments trust algorithmic systems despite their unreliability. For example, the IDF was described as trusting the decisions of the Gospel and Lavender systems ‘as if it were a human’ (Iraqi 2024; see also Fisk et al. 2019).
There is no known way to determine whether the targets generated pose actual threats or whether the automated systems have blundered. How someone is classified as a high rather than a low threat is also unclear. For example, in the course of US warfare in Pakistan, the US government marked a prominent Al Jazeera journalist, Ahmad Zaidan, as a terrorist because he had exchanged messages with members of Al-Qa’ida as part of his job (Fishman and Greenwald 2015). Israel has indiscriminately killed many journalists using drones, as in Khan Younis (Gaza) on January 7th, when it killed two journalists and their driver after deeming them a ‘threat’ for operating a consumer-type drone to record images of devastation in the area (Loveluck, Piper, and Cahlan 2024).
As indicated earlier, drones are not just used to expedite missions but to also protect military lives by predicting and eliminating threats. Euro-Med Monitor recounted the story of Silah, a woman who wanted to escape the Jabalia refugee camp, the largest refugee camp in Gaza, which is now destroyed. Silah was carrying a white flag and leading a group of civilians out of the camp when she was shot straight in the head and murdered on the spot by an IDF quadcopter. Her family could not bury her body, which had been left lying in the street for 10 days after the murder as the IDF was doing sweeps of the area. How could the IDF have known that Silah was a Hamas terrorist? They could not, but the quadcopter was there to reduce the potential harm to IDF soldiers. Indeed, as Human Rights Watch states, drone murders are precise, but they are often ‘precisely wrong’ (HRW 2009).
I recount this not simply to portray how much suffering wars bring (I find that to be self-evident) but to understand how and what drones, as technologies of war, do. In desensitizing us to drone violence(s), whose operation US military drone operators have likened to playing a game (Chamayou 2014; Gregory 2017) or to ‘predator porn’ (Phelps 2021), drones allow us to project fantasies of control, power, and the arbitration of life and death onto their subjects. The surveilled bodies are turned into expendable materials that can, at any given time, be liquidated without clear responsibility.
Using drones in war, however, makes for a very extreme case because they are usually deployed so far away from Europe. Below, I want to discuss how such military technologies never stay away from the home turf (e.g., Europe, the US, Türkiye, China). In fact, they are increasingly finding their place in civil societies (Bousquet 2019; Pong 2022; Gee 2024). How might what is happening in Gaza be relevant to us?
Protesting Today: Militarized Police
Gaza, like many other theatres of war, is also used as a site of experimentation. Unfortunately for Gaza, and quite typically for drone-surveilled areas, the experiment is exercised on civilian bodies. Not all drones are there to kill, but every time authorities use drones, they are also, in effect, training to target better.
These technologies are part and parcel of ‘power-geometries of globalization’ (El-Shewy, Griffiths, and Jones 2024, from Massey 2004, 12), meaning that they are not exclusive to particular geographies. They are exhibited at defense fairs and sold internationally. Israeli Pegasus spyware (Gee 2024), drones, and military AI technologies, for example, find their way across the world (Biddle 2024). Xtend, an Israeli drone company whose services were used by the IDF in the ongoing war in Gaza, recently raised USD 40 million to develop an ‘AI operating system that allows humans to manage teams of drones and robots for defense and civilian purposes’ (Wrobel 2024; Biddle and Lacy 2024). Moreover, NGOs argue that Xtend received EUR 50,000 from the EU’s Horizon Programme for a study on optimizing its drone systems and commercializing their operations (Askew 2024). The distant war in Gaza thus emerges as an immediate issue of human security in the age of (semi)autonomous technologies of war and policing in Europe, the West, and beyond (Tängh 2024).
Overreliance on unmanned systems can be problematic for policing because, instead of building trust, community, and social care, such systems allow the state to project its power in ways that are hard to control (by militarizing the police), even while claiming to act in the interest of public safety. As Zuev and Bratchford (2020) have argued, droning public spaces raises the question of who should be visible, what is made visible, and how remote power allows state structures to target specific individuals (for example, activists) during protests. Drone use is usually justified by the claim that drones help predict possible violence. However, the certainty of those predictions can hardly be tested. Regardless, as in the military, drone use in policing is growing.
Fish and Richardson (2021) tell us that over 1,100 law enforcement agencies in the US owned drones in 2021, and that Predator drones (of the kind used for military purposes in Afghanistan and Pakistan) were used to surveil Black Lives Matter (BLM) protests in 2020 (see also Biddle 2020; Chavis 2021). Police in London have used drones to police BLM, Extinction Rebellion, and right-wing protests, as well as protests against HS2, a proposed high-speed railway that is environmentally and economically controversial (Dodd 2021). Drones were even used in the UK to monitor people walking in national parks during the COVID-19 pandemic, assessing whether they were doing ‘essential’ or ‘non-essential’ tasks, that is, determining whether people should be walking outside during a particularly rigorous period of social distancing restrictions (Dodd 2021). In France in 2023, the May Day rallies were surveilled by drones that tracked the movements of crowds, allowing the police to predict vandalism and stop it in its tracks (France24 2023). This predictive policing parallels the role of Gospel and Lavender, whose purpose is to stop possible terrorist attacks before they develop.
Conclusion
The increase in drone use for various purposes (environmental, humanitarian, security, and military) should not be seen through a black-and-white lens. It is neither good nor bad. What we must pay attention to are new forms of relations, between the citizen and the government, for example, that might endanger people’s rights to life, privacy, protest, and freedom of speech. The knowledge(s) that we have on how drones operate stems directly from algorithms and procedures developed during conflict, usually far away from Europe or the US. There is indeed an ethical dimension to drone use that goes beyond the paradigm of rights and positions us squarely within the scope of justice and appropriateness.
While drone research has historically focused on military drones in the context of the US empire (Shaw 2016; Pong 2022), the normalization of drone use across the world (including predictive policing technologies more broadly) calls for a complex and multifaceted engagement with these technologies. Gaza is an apposite example in this context precisely because of its unique position as a strip of land on the shore of the Mediterranean, almost entirely encircled by Israel. Gaza has been an open-air prison camp since at least 2005, but it is current drone use that makes it an example par excellence of how life can be managed by aerial occupation. Gaza is therefore a microcosm of drone experimentation with ramifications across the world. Quadcopters, such as the one that killed Silah, are already part of civic life and the public space. Yet it is not just about quadcopters per se but rather about assemblages of infinitely connected algorithmic systems that surveil the public sphere in the name of safety every day. Ultimately, we should be attuned to the power of surveillance and predictive policing to affect our daily lives and the exercise of our freedoms.