On 17 September thousands of pagers held by members of Hizbullah across Lebanon and Syria exploded over the course of an hour, killing twelve people and injuring more than two thousand. The next day hundreds of walkie-talkies exploded, killing at least 25 people and injuring 750. These operations, designed to catch the world’s attention, were the latest example of the deployment by Israel’s military and intelligence services of spectacular high-tech methods. They were intended to send the message that Israel is an omnipotent security state.
The IDF doesn’t always advertise its new capabilities. In April, the Israeli/Palestinian website +972 Magazine and other outlets reported that Israeli intelligence units are using algorithmically generated kill lists to determine targets for missile strikes across the Gaza Strip. ‘I have much more trust in a statistical mechanism,’ one of the soldiers interviewed said. ‘The machine did it coldly.’ Over the past few months, I have spoken to a number of intelligence veterans – some of whom were serving last autumn and others who left the military a decade ago – about the way developments in algorithmic warfare have transformed Israeli military operations.
I met David (names have been changed throughout) in June at a café in West Jerusalem. He had volunteered for reserve duty in the Israeli intelligence corps a few days after the 7 October attacks. Many Israelis are one degree of separation from someone murdered, injured or kidnapped that night; in the week after the attacks, 360,000 reservists were mobilised and thousands more volunteered. ‘I guess I regressed in my thinking,’ David told me. ‘I just knew I could help in this way.’ But as the conflict dragged on, he found the rising body count in Gaza increasingly hard to accept. A few days before our meeting, he and 41 other IDF reservists signed an open letter explaining why they would no longer serve in the current war. David’s period of mandatory military service, which ended in 2020, coincided with the introduction of new technology: wiretaps were giving way to speech-to-text software and exhaustive databases from which operatives could extract social media logs, phone transcripts and private messages. David recalled spending hours monitoring ordinary people who had no connection to militant groups and no desire to cause harm.
The unit he volunteered in after 7 October was a ‘passive surveillance unit … not a targeting unit’, but it was converted into one as soon as the war began. The people his team had been watching in Gaza, a large number of whom had no involvement in Hamas’s military operations, were now seen as viable targets. ‘The commander felt he needed to show some kind of success so our efforts would be supported by military leadership,’ David said. The targeting lists his unit assembled were passed to the air force and used to justify assassination missions.
The IDF has a number of AI-assisted systems. Aviv Kohavi, its chief of staff until early last year, gave an interview about the new technology to Israel’s largest daily paper, Yedioth Ahronoth. He said that ‘each brigade has a sophisticated intelligence apparatus akin to the movie The Matrix,’ and mentioned the establishment of a new Targeting Directorate, ‘powered by AI capabilities’. As +972 reported and sources I’ve spoken to have confirmed, three tools in particular have been widely used across Gaza over the last two years: Lavender, Gospel and Where’s Daddy. Lavender provides a list of people to be approved for assassination. Gospel tries to determine where they live, or where they store weapons and plan military operations. Where’s Daddy sends alerts when the targets enter their family homes, so that the air force knows when to strike. All of these tools rely on machine learning systems to trawl through masses of data from a variety of sources, including drone and satellite reconnaissance, location monitoring, social media scraping and transcripts from phone calls, text messages and encrypted messaging applications. Algorithms determine patterns based on where someone went, at what time, to whom they talked and how often. These systems allow the military to bypass the many intelligence analysts, munitions experts and lawyers who were once required to determine valid targets and authorise attacks. Kohavi boasted that the new tools are capable of supplying twice as many targets in a day – at least a hundred – as intelligence units used to come up with in a year.
The algorithms have given a veneer of technological precision to a campaign that has caused largely indiscriminate destruction. The open letter David signed didn’t mention the huge number of civilians killed in Gaza, or the millions displaced, or the wider humanitarian catastrophe. Instead, the reservists argued that it was time ‘to invest all our efforts and resources in negotiating a deal that will bring back the hostages and restore the security of the state of Israel’. When we spoke, however, David condemned the army’s operations. ‘The mass bombings were depraved,’ he said, ‘and [the commanders] didn’t justify it politically, in terms of an aim, you know, they just bombed.’
The last time IDF reservists publicly opposed military operations in Gaza was in September 2014, a month after an assault had left more than 2250 dead. That letter was signed by 43 veterans of the elite Intelligence Corps Unit 8200. ‘It is commonly thought that the service in military intelligence is free of moral dilemmas and solely contributes to the reduction of violence and harm to innocent people,’ they wrote. ‘However, our military service has taught us that intelligence is an integral part of Israel’s military occupation over the territories.’ Their work, they said, exposed innocent civilians to surveillance, extortion and death. ‘We cannot continue to serve this system in good conscience, denying the rights of millions of people. Therefore, those among us who are reservists, refuse to take part in the state’s actions against Palestinians.’
A unit which twenty years ago was so small and inconsequential it wasn’t known to the public has now become the largest in the Israeli army, with several thousand personnel. Initially tasked with signals intelligence – tapping phone lines and radio transmissions – the unit’s operations expanded in the early 1990s. For Palestinians, 8200 became synonymous with dragnet policing and lethal aerial warfare. But for many young Israelis, a posting to the unit was an opening to a career, not an ideological commitment.
I spoke to Avi, one of the organisers of the 2014 letter. We met in a park near Israel’s defence headquarters in Tel Aviv, surrounded by armed soldiers ordering espressos and sandwiches from nearby stalls. Avi told me that reports of the IDF approving AI-generated kill lists reminded him of Hannah Arendt’s writing on bureaucratic violence in Eichmann in Jerusalem. ‘The technology always makes it feel like you are disconnected from violence,’ he said. ‘But sitting in an office determining the parameters with which an algorithm can allow civilians to be killed in targeted strikes: that’s the ultimate abdication of responsibility.’
Avi was conscripted into Israeli intelligence at the start of the Second Intifada in 2000, when he was eighteen. He was one of forty conscripts selected that year to take a specialised preparation course in Unit 8200. ‘I felt like I had found my place,’ he said. ‘I saw [intelligence] as constrained, rational, all geared towards preventing attacks on civilians.’ He enjoyed the intellectual challenge of the training. They sat in a classroom for sixteen hours a day with few breaks. ‘The slogan of the course was “everything is possible,”’ Avi said. It made him feel important. After a stint in boot camp, where conscripts are taught to shoot assault rifles at cardboard cutouts draped in keffiyehs, he was deployed to an intelligence base.
In the mid-2000s, military chiefs began remaking intelligence units in the image of Silicon Valley start-ups. The press framed service in military intelligence as ‘better than a degree from MIT’, claiming that it prepared young Israelis for success in a global tech economy. Applicants, typically from the middle-class, liberal and Ashkenazi communities who had rallied for an end to the occupation a few years earlier, vied for entry. Preparation began early. Teenagers took coding classes, studied foreign languages and passed the requisite tests. New recruits were rewarded with lectures from billionaire tech moguls and tours of Tel Aviv start-ups.
According to Gal, one of the organisers of the 2014 letter, Unit 8200 ‘always enjoyed a veil of secrecy. It has this glamorous reputation; it’s seen as a nice computer programming job. There’s this glitter of making a lot of money when you’re done.’ The reservists who refused to continue serving in intelligence in 2014 were conscious that much of what they had done in the army – from listening in on private conversations to engineering surveillance databases – was pivotal to the military’s lethal operations. ‘It felt like I had to do something,’ Avi said. ‘We were going in the wrong direction; I could see it.’
The 2014 letter was published just as venture capitalists and technology CEOs were touting big data analytics and machine learning. Israeli military commanders, many of whom went on to advise or lead private technology firms, were quick to see the potential advantages. ‘If commercial organisations are interested in identifying a need that can be met through targeted marketing,’ the head of Shin Bet wrote in 2015, ‘soldiers identify individuals and groups … from a sea of information to improve the intelligence organisations’ collection and attack capabilities.’ Military chiefs consulted with corporate CEOs on how to optimise their killing capabilities.
Israel’s military has long relied on tech companies within and beyond its borders to help wage war; most notably, Elbit and IBM have supplied computing systems to the IDF since the 1960s; Elbit also supplies unmanned vehicles and components in weapons produced by other companies. But the data-driven tech boom of the 2010s led the army to employ the services of civilian firms that were experimenting in mass surveillance and machine learning. The American data analytics firm Palantir opened an office in Tel Aviv and secured contracts with the Ministry of Defence and the IDF. Microsoft, Alphabet and Amazon all have offices in Israel. Start-ups staffed by veterans of intelligence units and funded by venture capital firms, often from the US or the EU, offered advanced surveillance and weapons systems. Among the most prominent were the cyber-espionage firm NSO, the biometric surveillance company Oosto and the hacking firm Cellebrite. Over the last decade, defence officials have claimed that the revolving door between the military and civilian technology firms is key to maintaining Israel’s military edge.
Alon, who was conscripted as an intelligence analyst in the late 2010s, worked at a military base in central Israel, writing reports on the security situation in the West Bank. He had access to troves of data about civilians across the occupied territories. The tools he used mined social media and other telecommunications to rank civilians, many of them minors, according to their potential to carry out ‘lone wolf’ terror attacks. ‘You can search for specific words, or specific people,’ he said when we spoke in Tel Aviv this summer, or ‘you can just browse through a series of results, like automated alerts about [specific] civilians in the West Bank.’
Israel does not have a constitution, but Article Seven of its Basic Law on Human Dignity and Liberty in theory guarantees a right to privacy to all citizens. Since its establishment in 2006, Israel’s Privacy Protection Authority has enforced a number of robust data protection laws at a similar level to EU regulations. But when it comes to Palestinian citizens of Israel, exceptions made in the name of national security mean these policies are only ever selectively enforced. In the occupied territories, Israel denies those living under military rule even the most nominal privacy protections.
By the late 2010s, as the sociologist Yagil Levy writes in Shooting and Not Crying, published in Hebrew last year, killing had become the principal metric of military efficacy. Operational success was measured by the number of targets generated and the percentage of assassinations carried out. ‘There was this romance with big data,’ Alon recalled. ‘People got rewarded for spearheading projects with buzzwords like “artificial intelligence” in the title.’ Commanders doled out medals to enterprising conscripts eager to help automate intelligence operations. Government officials celebrated Israel’s technological capabilities as proof of its military supremacy. In May 2023, Eyal Zamir, the director general of the Ministry of Defence, boasted that the country was on the verge of becoming an ‘AI superpower’.
All the hype stifled a number of warnings from establishment figures, including Michael Milshtein, head of IDF military intelligence’s Department for Palestinian Affairs until 2018, and Itzhak Brik, a former IDF general. Pouring time and resources into the latest surveillance tools, they argued, was eroding classic intelligence capabilities. These warnings did nothing to prevent the worst security failure in Israel’s history on 7 October. Alon wrote to his former commander when news of the atrocities broke. Two days later he was back in front of a computer on his old base. Like many Israelis, he was outraged at the political and military establishment for ignoring the evidence that Hamas was planning an attack. But within a week, he realised the military was more intent on inflicting revenge than on attaining lasting security. ‘After 7 October, the simplest thing for them was to say, OK, we let the machines do it,’ Alon told me. ‘I should be clear; they wanted to bomb as much as they could and to bomb hundreds of targets each day.’ Whistleblowers told reporters that commanders were given as little as twenty seconds to sign off on AI-determined bombings.
For the first few weeks of the war, David was at the base seven days a week, from early morning until late at night. Many of the usual aids to intelligence-gathering were unavailable: mobile networks were down, and people connected to Hamas’s military wing had thrown away their phones and gone underground. ‘Everything around me felt crazy,’ David said. ‘We were trying to find targets like mad … They wanted to show success, so they lowered their standards.’ Soldiers celebrated strikes even when women and children were killed. In the months that followed, killing became routine. Some people in David’s unit spent their mornings working at their tech jobs in Tel Aviv and their afternoons in command rooms. Political reality rarely punctured his work environment. ‘There was, of course, talk of hundreds of civilians being killed to take out top-level Hamas officials,’ he said. ‘Some people were shocked. They understand this is very significant, but they compartmentalise their moral feelings. And some people, they don’t care. They don’t need to care.’ In August the IDF recalled 15,000 personnel who had recently been demobilised. And late in September, a week after the pager attacks in Lebanon, it said it was calling up two reserve brigades ‘for operational missions in the northern arena’.
27 September