‘Life and Death Decisions’ – HRW Says Israel’s Use of Digital Tools Risks Civilian Harm in Gaza

Israel continues to carry out massacres against Palestinian civilians in Gaza. (Photo: Mahmoud Ajjour, Palestine Chronicle)

By Palestine Chronicle Staff  

“Problems in the design and use of these tools mean that, instead of minimizing civilian harm, the use of these tools could be resulting in the unlawful killing and wounding of civilians.”

The Israeli military’s use of surveillance technologies, artificial intelligence (AI), and other digital tools to help determine targets to attack in the Gaza Strip may be increasing the risk of civilian harm, Human Rights Watch (HRW) has said.

“The Israeli military is using incomplete data, flawed calculations, and tools not fit for purpose to help make life and death decisions in Gaza, which could be increasing civilian harm,” Zach Campbell, senior surveillance researcher at HRW, said on Wednesday.

“Problems in the design and use of these tools mean that, instead of minimizing civilian harm, the use of these tools could be resulting in the unlawful killing and wounding of civilians,” he added.

The rights body released a question-and-answer document about the digital tools, saying their use raises grave ethical, legal, and humanitarian concerns.

‘Faulty Data’

According to HRW, the Israeli military is using four digital tools in Gaza to estimate the number of civilians in an area before an attack, notify soldiers when to attack, and determine whether a person is a civilian or a combatant, as well as whether a structure is civilian or military.

The organization found that the digital tools appear to rely on faulty data and inexact approximations to inform military actions “in ways that could contravene Israel’s obligations under international humanitarian law, in particular the rules of distinction and precaution.”

The tools, HRW said, “entail ongoing and systematic surveillance of Palestinian residents of Gaza, including data collected prior to the current hostilities, in a manner that is incompatible with international human rights law.”

The military also uses Palestinians’ personal data to inform threat predictions, target identification, and machine learning, said the rights body.

The tools include one based on mobile phone tracking to monitor the evacuation of Palestinians from parts of northern Gaza, where the Israeli military ordered the entire population to leave on October 13.

‘The Gospel’

The tools also include “The Gospel,” which generates lists of buildings or other structural targets to be attacked, and “Lavender,” which assigns ratings to people in Gaza based on their suspected affiliation with Palestinian resistance groups, for the purpose of labeling them as military targets.

Also mentioned by HRW is a tool called “Where’s Daddy?”, which purports to determine when a target is in a particular location – often their presumed family home, according to media reports – so they can be attacked there.


Two of these tools, the evacuation monitoring tool and Where’s Daddy?, apparently rely on mobile phone location data to inform targeting decisions, troop movements, and other military actions.

“Although they have many practical uses in daily life, these tools are not accurate enough to inform military decisions, especially given the massive damage to communications infrastructure in Gaza,” said HRW.

Lavender and the Gospel rely on machine learning to distinguish military objectives from civilians and civilian objects.

‘Human Rights Implications’

Machine learning is a type of AI that uses computerized systems that can draw inferences from data and recognize patterns without explicit instructions. Using it to assign suspicion or inform targeting decisions can increase the likelihood of civilian harm, HRW said.


“The use of flawed technology in any context can have negative human rights implications, but the risks in Gaza couldn’t be higher,” said Campbell.

“The Israeli military’s use of these digital tools to support military decision-making should not be leading to unlawful attacks and grave civilian harm,” he added.

The rights organization warned that the Israeli military should ensure that any use of technology in its operations complies with international humanitarian law, and that no targeting decisions should be made based solely on recommendations by a machine learning tool.

If Israeli forces are acting upon these tools’ recommendations or assessments without sufficient scrutiny or additional information – as has been reported – and the resulting attacks cause civilian harm, “Israeli forces would be violating the laws of war,” HRW said.

Serious violations of the laws of war, such as indiscriminate attacks on civilians committed with criminal intent, are war crimes, said HRW.

According to Gaza’s Ministry of Health, 41,020 Palestinians have been killed and 94,925 wounded in Israel’s ongoing genocide in Gaza, which began on October 7.

Moreover, at least 11,000 people are unaccounted for, presumed dead under the rubble of their homes throughout the besieged enclave.

(The Palestine Chronicle)

(The Palestine Chronicle is a registered 501(c)3 organization, thus, all donations are tax deductible.)
