The Israeli army is making heavy use of an artificial intelligence (AI) system that mass-generates assassination targets in Gaza in order to reach certain thresholds of Palestinians killed each day, an explosive new report finds. The military uses this AI-generated information to kill targets as soon as they step into their homes, all but ensuring “collateral” deaths of non-targets and families.

According to a sprawling investigation by +972 Magazine and Local Call, the AI system, called “Lavender,” has marked as many as 37,000 Palestinians as supposed Hamas operatives since October 7, using information like visual characteristics, cellular phone activity, social networks, and more.

Sources said that the goal of the technology isn’t accuracy but to automatically generate as many targets as possible for the military to kill, with little to no human oversight to determine the targets’ legitimacy. Officers were under pressure from military higher-ups to approve as many targets as possible; on days when there were fewer targets, sources said, higher-ups would press officers to produce more.

“In a day without targets [whose feature rating was sufficient to authorize a strike], we attacked at a lower threshold. We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly,” one source, identified only as B., told +972 and Local Call.

“One day, totally of my own accord, I added something like 1,200 new targets to the [tracking] system, because the number of attacks [we were conducting] decreased,” said another anonymized source. “That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels.”

To speed up target elimination, soldiers have been instructed to treat Lavender-generated targets as orders rather than as something to be independently checked, the investigation found. Soldiers in charge of manually reviewing targets spent only seconds on each person, sources said, merely to confirm the target was a man. Children are also considered legitimate targets by Lavender.

There is no procedure to check whether someone was targeted in error, the report found, even though Lavender is considered within the military to have only a 90 percent accuracy rate. In other words, 10 percent of the people singled out as targets by Lavender, and by extension the families and civilians harmed or killed in the process, are not considered to have any real connection to Hamas militants.

Palestinians marked by Lavender were specifically targeted to be killed at their homes, under a system known as “Where’s Daddy?”, meaning they were often killed along with their families and any neighbors residing in the same building. The number of “collateral” casualties has fluctuated since October 7, but has ranged from five at the lower end to as many as hundreds even for low-level assumed militants, the report found.

“At 5 a.m., [the air force] would come and bomb all the houses that we had marked,” B. told +972 and Local Call. “We took out thousands of people. We didn’t go through them one by one — we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”

By contrast, the report cited an interview with a U.S. general who made intelligence decisions during the Iraq War; he said that even for a high-level target like Osama bin Laden, the number of “acceptable” collateral casualties would have been 30 at most, while for low-level commanders the typical number was zero.

Supposed low-level operatives residing in apartment buildings with “only” a few floors would often be killed with unguided or “dumb” bombs, the kind that maximize civilian deaths because of their imprecision.

The AI made automated calculations of how many collateral deaths each strike would cause, but one source said there was “no connection” between the calculations and reality. Often, homes were bombed when the supposed operative wasn’t even inside.

“The only question was, is it possible to attack the building in terms of collateral damage? Because we usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants,” one source, identified as C., told +972 and Local Call. “But even if an attack is averted, you don’t care — you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”

If the report’s contents are true, they provide a horrific explanation for the massive number of civilians and entire families wiped out by Israel’s genocidal assault over the past six months. It suggests that Israel is not fulfilling its obligations to minimize civilian casualties under international law, or under U.S. laws supposedly in place to prevent military aid from being used in humanitarian violations.

It also demonstrates, as advocates for Palestinian rights have long said, the motivation behind many of the Israel Defense Forces’ (IDF) actions: to slaughter as many Palestinians as quickly as possible.

Multiple sources told the publications that the implied reasoning behind the heavy use of Lavender was “revenge” for October 7. The Israeli army, for its part, has said that Lavender is just one of its many tools of slaughter.

According to the investigation, the idea for the AI system was originally dreamed up by a man who currently commands an Israeli intelligence unit and who wrote in a 2021 book that AI is the solution to ensuring that enough people are targeted for killing each day.

“We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day,” reads a passage from the book quoted in the investigation.

The report says that IDF officers have essentially been allowed or instructed to attack without judgment, in response to “hysteria in the professional ranks” after October 7.

“No one thought about what to do afterward, when the war is over, or how it will be possible to live in Gaza and what they will do with it,” one source, labeled A., said. “We were told: now we have to fuck up Hamas, no matter what the cost. Whatever you can, you bomb.”


Sharon Zhang is a news writer at Truthout covering politics, climate and labor. Before coming to Truthout, Sharon had written stories for Pacific Standard, The New Republic, and more. She has a master’s degree in environmental studies. She can be found on Twitter: @zhang_sharon.
