Reportedly, Israel utilized a database with AI assistance to create kill lists for Gaza during the conflict


Two Israeli media outlets have reported on the Israeli military’s use of an AI-assisted system known as Lavender to identify targets in Gaza. The use of artificial intelligence for targeting has raised concerns among human rights and technology experts, who warn it could lead to war crimes.

According to reports from +972 Magazine and Local Call, the Israeli army has been using an untested AI-powered database to isolate and identify potential bombing targets in Gaza. The system, named Lavender, reportedly has an error rate of about 10 percent, yet the Israeli military has been using it to quickly identify and target Hamas operatives in Gaza.

Critics have described the use of this AI-assisted targeting system as a form of genocide, as it has resulted in the deaths of many civilians in Gaza. The Israeli military has defended its use of AI, stating that analysts must independently verify targets to ensure they comply with international law. However, reports indicate that in some strikes on Hamas operatives, civilian deaths were deemed acceptable collateral damage.

Experts, including AI researcher Professor Toby Walsh, believe that the use of AI targeting may violate international humanitarian law. Sources cited in the reports claim that the Israeli military authorized the killing of civilians in order to target Hamas operatives, potentially constituting war crimes.

Despite criticism, Israel is reportedly seeking to export this technology to other countries. Journalist Antony Loewenstein warns that countries may admire Israel’s tactics in Gaza and consider using similar AI-assisted systems in their own military operations.
