Radio Free never takes money from corporate interests, which ensures our publications are in the interest of people, not profits. Radio Free provides free and open-source tools and resources for anyone to use to help better inform their communities. Learn more and get involved at radiofree.org
Segment 1: Gaza Bombing

The Israeli publications +972 and Local Call have revealed how the Israeli military used an artificial intelligence program known as Lavender to generate a “kill list” in Gaza containing as many as 37,000 Palestinians targeted for assassination with little human oversight. A second AI system, known as “Where’s Daddy?”, tracked Palestinians on the kill list and was purposely designed to help Israel target individuals when they were at home at night with their families. These targeting systems, combined with an “extremely permissive” bombing policy in the Israeli military, led to “entire Palestinian families being wiped out inside their houses,” says Yuval Abraham, the Israeli journalist who broke the story after speaking with members of the Israeli military who were “shocked by committing atrocities.” Abraham previously revealed that Israel used an AI system called “The Gospel” to deliberately destroy civilian infrastructure in Gaza, including apartment complexes, universities and banks, in an effort to exert “civil pressure” on Hamas. These military AI systems are “a danger to humanity,” says Abraham. “AI-based warfare allows people to escape accountability.”


This content originally appeared on Democracy Now! and was authored by Democracy Now!.

Citations

[1] Lavender & Where’s Daddy: How Israel Used AI to Form Kill Lists & Bomb Palestinians in Their Homes | Democracy Now! ➤ http://www.democracynow.org/2024/4/5/israel_ai