A recent report published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language media outlet Local Call revealed that the Israeli army identified tens of thousands of Palestinians in Gaza as potential targets using an AI targeting system called “Lavender”.
Marc Owen Jones, an assistant professor in Middle East Studies and digital humanities at Hamad Bin Khalifa University, spoke to Al Jazeera about the report:
“It is becoming increasingly clear that Israel is deploying untested AI systems that have not gone through transparent evaluation to help make decisions about the life and death of civilians,” he said.
“The fact that the operators can tweak the algorithms based on pressure from senior officers to find more targets suggests they are actually devolving accountability and selection to AI and using a computer system to avoid moral accountability.”
“The operators themselves have pointed to how the AI is simply an efficient killing machine,” he said, “and it is explicitly not used to reduce civilian casualties but to find more targets”.
“This helps explain how over 32,000 people have been killed. Let’s be clear: This is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war.”
He added, “It’s unlikely, without pressure from Israel’s allies, that there will be an end to [AI’s] use.”