Report: Israel used an artificial intelligence tool called Lavender to select targets in the Gaza Strip

According to an investigation by the Israeli publications +972 Magazine and Local Call, the Israeli military is using artificial intelligence to select bombing targets in the Gaza Strip, sacrificing accuracy in favor of speed and killing thousands of civilians in the process.

The report claims that the system, called “Lavender”, was developed after the Hamas terrorist attacks of October 7. At its peak, Lavender marked 37,000 Palestinians in Gaza as suspected “Hamas militants” and sanctioned their killing.

The Israeli military denied the existence of such a kill list in a statement to +972 and Local Call. A military spokesperson told CNN that AI was not used to identify suspected terrorists, but did not dispute the existence of the Lavender system, which the spokesperson described as “simply tools for analysts in the process of identifying a target.” Analysts “are required to conduct independent reviews in which they verify that identified targets comply with relevant definitions under international law and additional restrictions provided in IDF directives,” the spokesperson told CNN. The Israel Defense Forces did not immediately respond to The Verge’s request for comment.

In interviews with +972 and Local Call, however, Israeli intelligence officers said they were not required to conduct independent reviews of Lavender’s targets before bombing them and instead effectively served as a “rubber stamp” for the machine’s decisions. In some cases, the officers’ only role in the process was to determine whether a target was male.

Selecting targets

To build the Lavender system, information on known Hamas and Palestinian Islamic Jihad militants was fed into a dataset – but, according to one source who worked with the data team that trained Lavender, it also included data on people only loosely associated with Hamas, such as employees of Gaza’s Ministry of Internal Security. “What bothered me was that when Lavender was being trained, they used the term ‘Hamas fighter’ loosely and included people who were civil defense workers in the training data set,” the source told +972.

Lavender was trained to recognize “traits” associated with Hamas militants, such as being in a WhatsApp group with a known militant, changing mobile phones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in the Gaza Strip on a scale of 1 to 100 based on how similar they were to the known Hamas militants in the original dataset. People who reached a certain threshold were marked as targets for strikes. The threshold changed constantly “because it depends on where you set the bar for what a Hamas militant is,” one military source told +972.
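The mechanism the sources describe amounts to a similarity score combined with an adjustable cutoff. Below is a minimal, purely hypothetical Python sketch of that generic “score, then threshold” pattern as the report characterizes it; the trait names, weights, and cutoff value are invented for illustration and are not drawn from the reporting or from any knowledge of the actual system.

```python
# Purely illustrative sketch of a "score traits, then apply a threshold" flagging
# pattern of the kind described in the report. All trait names, weights, and the
# cutoff below are hypothetical placeholders, not details from the reporting.

FEATURE_WEIGHTS = {
    "shares_group_chat_with_known_militant": 40,  # hypothetical trait
    "frequent_phone_number_changes": 30,          # hypothetical trait
    "frequent_address_changes": 30,               # hypothetical trait
}

def similarity_score(person_features: dict) -> int:
    """Return a 1-100 score based on how many weighted traits a profile matches."""
    score = sum(
        weight
        for trait, weight in FEATURE_WEIGHTS.items()
        if person_features.get(trait, False)
    )
    return max(1, min(score, 100))

def flag_profiles(population: list, threshold: int) -> list:
    """Flag every profile whose score meets the (adjustable) threshold."""
    return [p for p in population if similarity_score(p["features"]) >= threshold]
```

In this kind of setup, lowering the threshold widens the set of people flagged, which is the dynamic the sources describe when they say the bar for what counts as a “Hamas militant” changed constantly.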

The system had an accuracy rate of 90 percent, the sources said, meaning that about 10 percent of the people it identified as Hamas militants were not members of Hamas’s military wing at all. Some of the people Lavender flagged as targets simply had names or nicknames identical to those of known Hamas militants; others were relatives of Hamas fighters or used phones that had once belonged to a Hamas fighter. “Errors were treated statistically,” a source who used Lavender told +972. “Because of the scale and scope, the protocol was that even if you don’t know for sure that the machine is right, statistically you know it’s fine. So you go for it.”

Collateral damage

Intelligence officers were given wide latitude when it came to civilian casualties, sources told +972. During the first few weeks of the war, officers were permitted to kill up to 15 or 20 civilians for every low-level Hamas fighter targeted by Lavender; for senior Hamas officials, the report claims, the military authorized “hundreds” of collateral civilian casualties.

Suspected Hamas militants were also targeted in their homes using a system called “Where’s Daddy?”, officers told +972. That system put targets generated by Lavender under constant surveillance, tracking them until they reached their homes – at which point they were bombed, often along with their entire families, the officers said. At times, however, officers bombed houses without verifying that the targets were inside, killing dozens of civilians in the process. “It happened to me many times that we attacked a house and the person wasn’t even home,” one source told +972. “As a result, you killed a family for no reason.”

AI-driven warfare

Mona Shtaya, a research fellow at the Tahrir Institute for Middle East Policy, told The Verge that the Lavender system is a continuation of Israel’s use of surveillance technologies against Palestinians in both the Gaza Strip and the West Bank.

Shtaya, who lives in the West Bank, told The Verge that these tools are particularly troubling in light of reports that Israeli defense startups hope to export their battle-tested technologies abroad.

Since the start of its ground offensive in Gaza, the Israeli military has used and developed a range of technologies to identify and attack suspected Hamas militants. In March, The New York Times reported that Israel had deployed a mass facial recognition program in the Gaza Strip – creating a database of Palestinians without their knowledge or consent – which the military then used to identify suspected Hamas militants. In one case, the facial recognition tool flagged Palestinian poet Mosab Abu Toha as a suspected Hamas militant. Abu Toha was held in an Israeli prison for two days, where he was beaten and interrogated, before being returned to Gaza.

Another artificial intelligence system, called “the Gospel”, was used to identify buildings and structures from which Hamas is believed to operate. According to a +972 and Local Call report from November, the Gospel has also contributed to huge numbers of civilian casualties. “When a three-year-old girl is killed in a house in Gaza, it is because someone in the army decided that killing her was not a big deal – that it was a price worth paying to strike [another] target,” a military source told the publications at the time.

“We must see this as a continuation of the policy of collective punishment that has been used as a weapon against Palestinians for decades,” Shtaya said. “We must ensure that wartime is not used to justify mass surveillance and the mass killing of people, especially civilians, in places like Gaza.”
