The Israeli military has reportedly used artificial intelligence to help pinpoint bombing targets in Gaza, according to an investigation by +972 Magazine and Local Call. Six Israeli intelligence officials who served in the alleged program said human review of the suggested targets was minimal. The AI tool, known as “Lavender,” was said to have a 10% error rate, according to the officials cited in the investigation by the online publication jointly run by Palestinians and Israelis.
In response to the report by +972 Magazine, the Israel Defense Forces (IDF) did not dispute the tool’s existence but denied using AI to identify suspected terrorists. The IDF stressed in a statement that “information systems are simply tools for analysts in the target identification process,” and that Israel aims to “minimize harm to civilians as much as possible in the operational circumstances prevailing at the time of the strike.”
The IDF stated that “analysts must carry out independent examinations, ensuring that the identified targets align with the relevant definitions in accordance with international law and additional restrictions outlined in the IDF directives.”

Nonetheless, one official told +972 that human personnel often served merely as a “rubber stamp” for the machine’s decisions, typically spending only about 20 seconds on each target – confirming the target’s gender – before approving a bombing.
The investigation comes amid growing international scrutiny of Israel’s military campaign, after targeted airstrikes killed foreign aid workers delivering food in the Palestinian enclave. According to the Gaza Ministry of Health, Israel’s siege of Gaza has killed at least 32,916 people, deepening a dire humanitarian crisis. A United Nations-backed report finds that nearly three-quarters of the population in northern Gaza face catastrophic levels of hunger.
Yuval Abraham, the author of the investigation, previously spoke to CNN in January about his research on the Israeli military’s heavy reliance on artificial intelligence to identify targets for assassinations, with minimal human oversight.
In response to these claims, the IDF said it does not use an artificial intelligence system to identify terrorists or predict whether a person is a terrorist. Rather, its analysts use a database to cross-reference intelligence sources and compile updated information on military operatives of terrorist organizations.
According to the IDF statement, human officers are responsible for verifying that identified targets meet the relevant definitions under international law and IDF directives – a verification process that +972 also describes.
The report further stated that the Israeli military carried out deliberate strikes on targets in residential areas, often under the cover of darkness when entire families were present.
According to the sources, decisions made with the AI program led to the deaths of thousands of Palestinians – predominantly women, children, and people not involved in the fighting – most of them killed by Israeli airstrikes in the initial weeks of the war.
The report, based on information from insiders, revealed that when targeting alleged junior militants, the army preferred to use what are known as dumb bombs – unguided bombs that can cause extensive destruction.
In December, it was revealed that roughly half of the 29,000 munitions dropped on Gaza the previous autumn were unguided bombs. Such bombs pose a greater risk to civilians, particularly in densely populated areas such as Gaza.
The IDF statement emphasizes that strikes are conducted only when the expected collateral damage is deemed proportionate to the military advantage, and that every effort is made to minimize harm to civilians in the given operational circumstances.
Furthermore, it states that targets are carefully reviewed before strikes, and the appropriate munition is selected based on operational and humanitarian factors. This includes considerations such as the structural and geographical features of the target, the surrounding environment, potential impact on nearby civilians, critical infrastructure in the vicinity, and other relevant aspects.