The New York Times reports that Israel’s military intelligence has been using an experimental facial recognition program in Gaza that’s misidentified Palestinian civilians as having ties to Hamas. Google Photos allegedly plays a part in the chilling program’s implementation, although it appears not to be through any direct collaboration with the company.
The surveillance program reportedly started as a way to search for Israeli hostages in Gaza. However, as often happens with new wartime technology, the initiative was quickly expanded to “root out anyone with ties to Hamas or other militant groups,” according to The NYT. The technology is flawed, but Israeli soldiers reportedly haven’t treated it as such when detaining civilians flagged by the system.
According to intelligence officers who spoke to The NYT, the program uses tech from the private Israeli company Corsight. Headquartered in Tel Aviv, the company promises its surveillance systems can accurately recognize people with less than half of their faces exposed. It can supposedly be effective even with "extreme angles (even from drones), darkness and poor quality."
But an officer in Israel’s Unit 8200 learned that, in reality, it often struggled with grainy, obscured or injured faces. According to the official, Corsight’s tech produced false positives, as well as cases where a correctly identified Palestinian was wrongly flagged as having Hamas ties.
Three Israeli officers told The NYT that the military used Google Photos to supplement Corsight’s tech. Intelligence officials allegedly uploaded data containing known persons of interest to Google’s service, then used the app’s photo search feature to flag those individuals in the military’s surveillance materials. One officer said Google’s ability to match partially obscured faces was superior to Corsight’s, but the military continued using the latter because it was “customizable.”
When contacted for a statement, a Google spokesperson reiterated to Engadget that the product only groups faces from images you’ve added to your library. “Google Photos is a free product which is widely available to the public that helps you organize photos by grouping similar faces, so you can label people to easily find old photos. It does not provide identities for unknown people in photographs,” they wrote.
One man erroneously detained through the surveillance program was poet Mosab Abu Toha, who told The NYT he was pulled aside at a military checkpoint in northern Gaza as his family tried to flee to Egypt. He was then allegedly handcuffed and blindfolded, then beaten and interrogated for two days before finally being released. He said soldiers told him before his release that his questioning (and then some) had been a “mistake.”
Abu Toha, author of Things You May Find Hidden in My Ear: Poems From Gaza, said he has no connection to Hamas and wasn’t aware of an Israeli facial recognition program in Gaza. However, during his detention, he said he overheard someone saying the Israeli army had used a “new technology” on the group with whom he was incarcerated.
Update, March 27, 2024, 4:32 PM ET: This story has been updated to add a statement to Engadget from Google.
This article originally appeared on Engadget at https://www.engadget.com/israels-military-reportedly-used-google-photos-to-identify-civilians-in-gaza-200843298.html?src=rss