Google Photos as a tool for waging war

How did the Israeli army adapt the popular photo service to its needs?

According to The New York Times, Israeli military intelligence is actively using the publicly available Google Photos service in the Gaza Strip as an alternative to its professional facial recognition system.

Israel recently began using facial recognition technology to search for hostages in Gaza, but soon expanded its use to look for any links to Hamas or other militant groups.

Developed by the private Israeli company Corsight, the technology promised accurate face recognition even when faces are only partially visible in the frame. Nevertheless, false positives from this commercial system have already been recorded, leading to the arrests of civilians who were wrongly identified as affiliated with Hamas.

In other words, Corsight's technology was far from perfect, yet for a long time Israeli soldiers did not take this into account and trusted it fully. Once they realized its error rate was too high, they began using Google Photos alongside the main system, uploading the faces that needed to be identified to the service.
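The false positives described above typically come down to how strictly a system compares face embeddings. The article does not describe Corsight's internals, but a minimal, purely illustrative sketch (toy two- and three-dimensional vectors and a hypothetical watchlist; real systems use embeddings with hundreds of dimensions) shows how a loose match threshold can flag an uninvolved person:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy embeddings; "suspect_A" is a hypothetical watchlist entry.
watchlist = {"suspect_A": [0.9, 0.1, 0.2]}
civilian = [0.8, 0.3, 0.3]  # a different person whose embedding happens to be similar

def match(probe, threshold):
    # Return watchlist names whose similarity to the probe meets the threshold.
    return [name for name, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

# A loose threshold produces a false positive; a strict one does not.
print(match(civilian, 0.90))  # → ['suspect_A']  (false positive)
print(match(civilian, 0.99))  # → []
```

The trade-off is inherent: raising the threshold reduces false arrests but also lets real targets slip through, which is why no threshold setting alone can make such a system trustworthy.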

One officer noted that Google's face-matching capabilities are superior to Corsight's, but the latter offers flexible configuration, so the Israeli military continues to use it.

A Google spokesperson said that Google Photos is a publicly available product designed to organize photos by grouping similar faces, and that it does not identify unknown people in photos. This statement largely cleared Google of public suspicion that the product contains hidden functions ordinary users should worry about.
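The distinction Google draws here is between *grouping* (clustering unlabeled faces into anonymous "same person" piles) and *identification* (attaching a name by matching against a labeled gallery). A minimal sketch of the grouping side, using toy embeddings and a naive greedy clustering that is only an assumption for illustration, not Google's actual algorithm:

```python
import math

def similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def group_faces(embeddings, threshold=0.95):
    # Greedy clustering: each face joins the first group whose
    # representative it resembles, otherwise it starts a new group.
    groups = []  # each group is a list of embeddings, with no names attached
    for emb in embeddings:
        for group in groups:
            if similarity(emb, group[0]) >= threshold:
                group.append(emb)
                break
        else:
            groups.append([emb])
    return groups

# Three toy face embeddings: the first two belong to the same person.
photos = [[1.0, 0.0], [0.99, 0.05], [0.0, 1.0]]
groups = group_faces(photos)
print(len(groups))  # → 2 anonymous groups, no identities involved
```

Identification would require an extra, labeled input (a gallery mapping names to embeddings), which is exactly what the military supplied by uploading faces of interest, turning an anonymous grouping tool into a de facto identification one.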

The takeaway from this story is that facial recognition technologies cannot be fully relied upon: they are imperfect and can produce false results, and their misuse for military purposes can lead to irreparable consequences.

The conclusions of automated systems, whether facial recognition or smart assistants, should always be treated critically and verified for accuracy. Earlier, we wrote about how US residents suffered mass poisonings from toxic mushrooms after blindly trusting a specialized AI app while foraging.