Reports about Israel's use of artificial intelligence in its war on Gaza have sparked international and UN outrage. UN Secretary-General António Guterres expressed deep concern over the reports, saying that this technology should be employed for good, not for killing, and voicing grave worry that "life and death decisions" could be linked to "calculations made by algorithms."
Israeli intelligence sources revealed to The Guardian that Israeli forces are using a system called "Lavender" in the Gaza conflict. The sources alleged that reliance on this system has led Israeli forces to authorize the killing of civilians while pursuing Palestinian fighters.
### Identification of 37,000 Potential Palestinian Targets
Intelligence sources reported that Israeli forces have relied on an AI-assisted database that identified 37,000 potential Palestinian targets, building a register of individuals using algorithms that progressively refine the search criteria. The "Lavender" system was developed by Unit 8200, the Israeli military's elite intelligence unit.
The Israeli military described "Lavender" as a database used to "gather various intelligence to produce information at multiple levels about the military operations of (terrorists), and not a confirmed list of names of fighters to target." According to the report, based on the testimony of six intelligence officers to journalist Yuval Abraham, the use of AI systems in the war on Gaza has entered uncharted territory after six months of bloody bombardment that has left tens of thousands of Palestinian casualties, raising a host of legal and ethical questions and altering the relationship between military personnel and machines.
### How the "Lavender" System Works
An Israeli officer who used the "Lavender" system said he placed greater trust in a "statistical mechanism" than in a grieving soldier, adding: "The machine does the task coldly, and that makes it easier." According to the six officers, "Lavender" marked around 37,000 Palestinians as linked to Hamas or Islamic Jihad.
The Israeli military has denied these claims, stating that it "does not use any system to identify suspected targets."
### Use of "Dumb Bombs"
The attacks were carried out using so-called "dumb bombs": unguided munitions that demolish entire homes along with their occupants. An intelligence source said: "We won't use smart bombs against an unimportant fighter; they are very expensive, and we have a shortage of them." Military experts say that Israel's use of dumb bombs to level the homes of thousands of Palestinians, with targets selected with the help of AI, helps explain the high number of war casualties.
### Allowing the Killing of 15 or 20 Civilians
The report stated that during air raids targeting low-ranking fighters, Israeli soldiers were permitted to kill 15 or 20 civilians, and up to 100 civilians for each senior leader, by dropping "dumb bombs" that destroy entire homes and kill everyone inside. The Palestinian death toll has reportedly risen to over 33,000 after six months of war, with injuries surpassing 75,000.
The Guardian previously reported that the Israeli military uses an AI program called "Gospel," which is fed data to select "targets" for bombing in Gaza, including armed groups and their leaders.
### The "Gospel" Program
The newspaper noted that a secret military intelligence unit relying on AI plays a crucial role in Israel's response to Hamas's attacks. Israel is deploying numerous powerful AI systems in its war with Hamas, opening a new chapter in advanced warfare that raises questions about the legality and ethics of such systems, as well as about the changing relationship between military personnel and machines, according to the British newspaper.
Earlier, The New York Times reported, citing Israeli intelligence and military officers, that Israeli forces use a facial recognition program capable of "collecting and indexing images of Palestinian faces" and identifying individuals within seconds. The officers confirmed that "Israel began using this program late last year" without making it public, gathering and storing images without the knowledge or consent of the Palestinian population. One officer told the newspaper that the technology has at times mistakenly classified civilians.