Rising concerns over civilian risks from Israel’s AI use in Gaza conflict

Israel’s use of Artificial Intelligence (AI) technology in its military operations in Gaza, particularly in targeting civilian areas, has sparked concern. Reports allege deliberate targeting of civilians, raising ethical questions about the integration of technology into modern warfare.



The conflict between Israel and Hamas in Gaza has entered a new phase, marked by renewed Israeli strikes on 1 December 2023, after a brief seven-day truce. However, adding complexity and concern to this longstanding conflict is Israel’s reported use of Artificial Intelligence (AI) technology in its military operations, particularly in targeting civilian areas.

This development has sparked global outcry over the increase in civilian casualties and raised questions about the ethical implications of AI in warfare. Israel officially states that the resumption of strikes was prompted by Hamas violating the ceasefire agreement, firing artillery into Israeli territory moments before the truce ended.

The toll on the ground has been devastating, with almost 300 Palestinians reported killed. Shockingly, the casualties include a significant percentage of women and children, according to the Palestinian Ministry of Health in Gaza, intensifying concerns about the impact on non-combatant populations.

A joint investigation by Israeli-Palestinian newspaper +972 Magazine and Hebrew publication Local Call has revealed disturbing findings about the Israeli military’s use of AI. The report suggests that the expansion of the Israeli army’s authorization to target non-military sites, coupled with the deployment of an AI system, may have contributed to the initial stages of the conflict’s destructiveness.

Based on conversations with current and former members of Israel’s intelligence community, Palestinian testimonies, and official statements, the investigation raises alarming questions about the intentional targeting of civilian populations. The indiscriminate bombing, the report details, seems to be aimed at creating shock within Palestinian civil society, with the hope of pressuring Hamas.

At the centre of these revelations is an AI system known as ‘Habsora’ or ‘The Gospel.’ Described as enabling a “mass assassination factory,” this AI system reportedly processes massive amounts of data, allowing the Israeli army to prioritize quantity over quality in targeting.

The report alleges that The Gospel generates an astounding 444 bombing targets daily, demonstrating the scale and speed at which AI is integrated into military decision-making.

Disturbingly, it suggests that the Israeli army has files on most potential targets in Gaza, including residential homes, with stipulated expected civilian casualties. This raises ethical questions about the intentional nature of the attacks and the calculated acceptance of collateral damage.

Aviv Kochavi, the former IDF Chief of Staff, emphasized the advanced capabilities of the AI-powered target division.

According to Kochavi, the system translates vast amounts of data into attack targets more effectively than humans. The Gospel, activated during the 11-day war with Hamas in May 2021, reportedly generated 100 targets a day.

Formed in 2019, the IDF’s target division has identified over 12,000 targets in Gaza. Through rapid, automated intelligence extraction, The Gospel produces targeting recommendations intended to match what human analysts would identify.

The automation and speed introduced by AI have significantly expedited the target creation process, raising concerns about the lack of in-depth analysis and the potential for unintended consequences.

Experts and human rights advocates are sceptical about the claimed benefits of AI in reducing civilian harm. The visible impact of the bombardment on Gaza’s landscape challenges assertions of precision and narrowness of force. IDF figures reveal an unprecedented number of targets attacked, raising concerns about the offensive’s scale and intensity.

In response to these allegations, an Israeli military spokesperson stated that the IDF operates to dismantle Hamas’s military and administrative capabilities, emphasizing adherence to international law and precautions to mitigate civilian harm.

Death toll in Gaza conflict

Regarding the 7 October attack by Hamas, Israel’s figures indicate that approximately 1,200 people were killed, and 240 hostages were seized. As of 30 November, 102 hostages have been freed, including 70 Israelis, following a deal that led to Israel releasing 210 Palestinian prisoners.

Hamas’ media office reported on Tuesday that, since the conflict erupted on 7 October, Israel’s military actions in Gaza have resulted in the deaths of at least 16,248 people, including 7,112 children and 4,885 women. Additionally, thousands are missing, feared to be buried under the rubble.

At least nine relatives of CNN producer Ibrahim Dahman, including his uncle and aunt’s families, were killed in an Israeli strike in northern Gaza, CNN reported.

Dahman was informed on Sunday that a strike had hit his aunt’s home in Beit Lahia, resulting in these casualties. His uncle, the uncle’s wife, their daughter, two grandchildren, his aunt, her husband, and their two children were among the deceased.

Additionally, at least two other family members are in critical condition, with more feared to be buried under the rubble.

