Beaten, abused and traumatized: an American was the victim of a facial recognition system error

The high-profile case has renewed questions about whether facial recognition technology should be used at all.

A 61-year-old US resident has filed a $10 million lawsuit against the retailer Macy's and EssilorLuxottica, which owns the Sunglass Hut chain of stores. The suit stems from an unlawful detention based on an inaccurate facial recognition match, as a result of which the man was wrongly arrested on robbery charges and sexually assaulted in jail.

The events date back to 2022, when two men committed an armed robbery of a Sunglass Hut kiosk inside a Macy's store, stealing sunglasses and several thousand dollars in cash. While investigating the crime, Houston police used facial recognition software that, working from old photos, mistakenly identified Harvey Eugene Murphy Jr. as the suspect. Compounding the software's error, a store employee also identified Murphy as one of the robbers from the photos provided.

Murphy was arrested while renewing his driver's license at a Department of Transportation office and sent to an overcrowded maximum-security jail housing violent criminals. There he was beaten, raped, and left with permanent injuries. Murphy told his lawyer that he had been in California at the time of the robbery, which was soon confirmed, and after several hours in custody all charges against him were dropped.

Murphy's lawsuit argues that his case is both tragic and a warning to everyone: facial recognition software, notorious for its errors and high false-positive rates, can lead to the wrongful prosecution and imprisonment of anyone.

Murphy accuses Macy's and EssilorLuxottica of malicious prosecution, false imprisonment, and gross negligence, and is seeking $10 million in damages. Neither company has yet responded to requests for comment.

The digital rights organization Fight for the Future (FFTF) said Macy's confirmed that facial recognition software was used in this case. In FFTF's view, the technology should not be used at all: private companies deploying facial recognition put their customers in serious danger, and this case once again confirms what has long been known – there is no safe way to use facial recognition systems, and they should be banned.