
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

delirious_owl (@delirious_owl@discuss.online):

Looks like the article is not accessible on Tor. Here's as much of the article as I can paste before reaching Lemmy's maximum character limit.

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

By Yuval Abraham | April 3, 2024

In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential "targets" for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a "human bottleneck for both locating the new targets and decision-making to approve the targets."

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as "Lavender," unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine "as if it were a human decision."

Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, adding that, normally, they would personally devote only about "20 seconds" to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as "errors" in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called "Where's Daddy?" also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.

The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program's decisions.

"We were not interested in killing [Hamas] operatives only when they
were in a military building or engaged in a military activity," A., an
intelligence officer, told +972 and Local Call. "On the contrary, the
IDF bombed them in homes without hesitation, as a first option. It's
much easier to bomb a family's home. The system is built to look for
them in these situations."

The Lavender machine joins another AI system, "The Gospel," about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military's own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list.

In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as "dumb" bombs (in contrast to "smart" precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. "You don't want to waste expensive bombs on unimportant people — it's very expensive for the country and there's a shortage [of those bombs]," said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of "hundreds" of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as "collateral damage."

In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any "collateral damage" during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.

The following investigation is organized according to the six chronological stages of the Israeli army's highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the "Where's Daddy?" system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how "dumb" bombs were chosen to strike these homes.

Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the number of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time.

STEP 1: GENERATING TARGETS

'Once you go automatic, target generation goes crazy'

In the Israeli army, the term "human target" referred in the past to a senior military operative who, according to the rules of the military's International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel's previous wars, since this was an "especially brutal" way to kill someone — often by killing an entire family alongside the target — such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.

(Max character limit reached.) Read the entire article here (mirror).

Rentlar:

There appear to be a number of systems here.

The previously reported AI system, "The Gospel," marked buildings that presumably contained targets. "Lavender" is a newly revealed AI system that marked people who displayed characteristics similar to those of known militants. Among other mistakes, it tended to flag non-combatant civil workers in the Hamas-run government, yet it was understood to have a 90% accuracy rate based on a manually checked sample. The threshold for valid targets changed from day to day, depending on how many targets the higher command wanted. A third system, "Where's Daddy?", followed the input targets so they could be efficiently found and killed, along with their families, children, and any other uninvolved families that happened to be there. This appears to have been a matter of convenience for the IDF.

Intelligence personnel tended to simply copy and paste the (90% accurate) Lavender output directly into the "Where's Daddy?" system. Typically the only check prior to strike authorization was whether the target was male, which did not account for the rest of the people in the target building being mostly women and children, and in some instances the intended target had fled or avoided the area by the time the strike occurred.


The Nazi Party kept better records and had more oversight of their systematic genocide campaign 80 years ago.

Siegfried:

Something as irresponsible as letting an AI decide where to drop bombs should be added to the list of war crimes.
