A new method allows bypassing facial recognition systems and impersonating someone else

The new attack differs from other adversarial attacks in that it does not merely disguise the person in the photo, but makes recognition systems identify them as someone else entirely.

Specialists at the Israeli company Adversa AI, which develops artificial intelligence (AI) technologies, have presented a new method of deceiving face recognition systems by adding so-called noise to photos. This noise consists of tiny bits of data that cannot be seen with the naked eye but are enough to make a facial recognition system believe that a different person is depicted in the photo. In particular, the researchers demonstrated how they forced the PimEyes face recognition system to mistake the head of Adversa AI, Alex Polyakov, for Elon Musk.

Adversarial attacks are improving every year, as are the ways to defend against them. However, the Adversarial Octopus attack presented by Adversa AI stands apart for a number of reasons.

First, Adversarial Octopus does not merely disguise the person depicted in the photo; it makes recognition systems identify them as someone else. Second, instead of adding noise to the images on which AI models are trained (a so-called poisoning attack), the new method alters the image that is submitted to the face recognition system, and it requires no internal knowledge of how that system was trained.
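
Since the attack itself has not been published, here is only a rough, minimal sketch of the general idea of an evasion-style perturbation, along the lines of the well-known FGSM technique (Goodfellow et al., 2015), not Adversa AI's method. The `TinyRecognizer` model, the `fgsm_perturb` helper, and the epsilon value are illustrative assumptions, standing in for a real face recognition pipeline.

```python
# Generic evasion-style perturbation sketch (FGSM), NOT the unpublished
# Adversarial Octopus attack. The model is a hypothetical stand-in.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyRecognizer(nn.Module):
    """Hypothetical stand-in for an image recognition model."""

    def __init__(self, num_identities: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
        )
        self.head = nn.Linear(16 * 8 * 8, num_identities)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Add a small, near-invisible perturbation that pushes the model's
    prediction away from the true identity (untargeted evasion)."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # Step in the direction that increases the loss, bounded by epsilon.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = TinyRecognizer().eval()
    photo = torch.rand(1, 3, 64, 64)   # placeholder "photo" in [0, 1]
    identity = torch.tensor([3])       # placeholder true identity label
    adv_photo = fgsm_perturb(model, photo, identity)
    print("max pixel change:", (adv_photo - photo).abs().max().item())
    print("original prediction:", model(photo).argmax(1).item())
    print("perturbed prediction:", model(adv_photo).argmax(1).item())
```

The key point the sketch illustrates is that the perturbation is bounded by epsilon per pixel, so it stays imperceptible to a human, and it is applied to the query image rather than to any training data, matching the evasion-versus-poisoning distinction described above.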

The authors of Adversarial Octopus have not yet published a scientific paper with a full explanation of the attack. The researchers will provide details only after completing the process of responsibly disclosing the vulnerability to facial recognition system developers.
