Who owns a face, owns the world: deepfakes let you see what biometric systems don't see

Technology is moving forward, and so are the ways to bypass authentication systems.

Biometric identification technologies are gaining popularity among companies seeking to provide a fast and reliable authentication process. However, as Stuart Wells, CTO of the biometric authentication company Jumio, warns, the risk that fraudsters will circumvent such systems is also growing.

Europol predicts that by 2026 up to 90% of online content could be artificially generated, which will make it harder to identify users accurately.

Wells describes a technique called camera injection, which lets an attacker feed deepfake video into a system and fool biometric identification tools. Camera injection occurs when an attacker intercepts or replaces the signal coming from a camera's CCD (charge-coupled device) sensor in order to distort or substitute the captured image, feeding the system pre-recorded material or a live video stream with a face swap created using deepfake technology.

The pre-recorded content can be a real video of the victim, a video in which the victim's face has been altered in some way, or a fully generated face.

There are several ways to bypass a live feed, which is normally captured by the CCD sensor of a real camera. One is to compromise the real camera's device driver and inject the video stream at a lower level of the driver stack. A more common injection method is a virtual camera device driver that simply passes a pre-recorded or real-time generated video stream to the system, presenting it as the image from a real camera.

Because video is a series of still images, a fraudster sometimes inserts the same image into every frame of the stream, which produces a video with no movement. A more sophisticated, but also more time-consuming, method is to alter or fabricate a video sequence that contains movement. The most difficult approach is to manipulate the deepfake in real time so that it performs actions requested by the system consuming the injected feed.
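
The static case is also the easiest to spot. Below is a minimal detection sketch, assuming OpenCV is available and the default capture device can be read; the frame count and threshold are arbitrary placeholders, not calibrated values.

import cv2
import numpy as np

# Arbitrary placeholders: how many consecutive frames to sample and how little
# average pixel change counts as "suspiciously static".
FRAMES_TO_CHECK = 60
STATIC_THRESHOLD = 0.5

def looks_static(capture: cv2.VideoCapture) -> bool:
    """Return True if consecutive frames show almost no pixel-level motion."""
    ok, prev = capture.read()
    if not ok:
        raise RuntimeError("could not read from the camera")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    diffs = []
    for _ in range(FRAMES_TO_CHECK):
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Mean absolute per-pixel difference between consecutive frames.
        diffs.append(float(np.mean(cv2.absdiff(gray, prev))))
        prev = gray

    return bool(diffs) and float(np.mean(diffs)) < STATIC_THRESHOLD

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default camera
    print("static stream suspected:", looks_static(cap))
    cap.release()

In practice such a check would run alongside, not instead of, active liveness prompts, since a looping pre-recorded clip still contains motion.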

The main danger of injection is that a successfully inserted video may go unnoticed. Such an attack can lead to identity theft, the creation of fake accounts, and fraudulent transactions.

According to Wells, ensuring security requires mechanisms that detect compromised camera drivers and recognize manipulation by comparing natural movement with the movement seen in the submitted video.
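
One small building block of such a defense, sketched here for a Linux host that exposes cameras through V4L2, is to inspect the names that video devices report and flag anything resembling a known virtual camera; the names in the list are illustrative examples, not an exhaustive or authoritative blocklist.

import glob
from pathlib import Path

# Illustrative substrings seen in common virtual-camera device names; a real
# deployment would maintain its own list or, better, allow-list known hardware.
SUSPICIOUS_NAMES = ("obs virtual camera", "v4l2 loopback", "dummy video device")

def suspicious_video_devices() -> list:
    """Return V4L2 devices whose advertised name matches a known virtual camera."""
    flagged = []
    for name_file in glob.glob("/sys/class/video4linux/video*/name"):
        name = Path(name_file).read_text().strip()
        if any(s in name.lower() for s in SUSPICIOUS_NAMES):
            flagged.append(f"{Path(name_file).parent.name}: {name}")
    return flagged

if __name__ == "__main__":
    for entry in suspicious_video_devices():
        print("possible virtual camera:", entry)

This only raises a flag: a determined attacker can rename a virtual device, so the check is best combined with the motion-based signals described next.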

A device's built-in accelerometer, which registers movement along its axes, can also be used: its readings can be compared with the motion observed in the recorded video to determine whether the camera feed has been tampered with.
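
A rough sketch of that comparison follows, assuming the client can already sample per-frame accelerometer magnitudes and estimate per-frame global image motion (both passed in as plain arrays here); the correlation threshold is a placeholder rather than a tuned value.

import numpy as np

# Placeholder threshold: correlation below this suggests the video does not
# follow the physical motion of the device.
MIN_CORRELATION = 0.3

def motion_matches_sensor(accel_magnitude, image_motion) -> bool:
    """Compare per-frame accelerometer magnitude with per-frame global image motion.

    Both signals must be sampled on the same timeline, one value per frame.
    """
    accel = np.asarray(accel_magnitude, dtype=float)
    motion = np.asarray(image_motion, dtype=float)
    if accel.shape != motion.shape or accel.size < 2:
        raise ValueError("signals must have the same non-trivial length")
    if accel.std() == 0 or motion.std() == 0:
        # No variation in one signal: a perfectly still feed while the device
        # reports movement (or the reverse) is itself suspicious.
        return False
    corr = np.corrcoef(accel, motion)[0, 1]  # Pearson correlation
    return corr >= MIN_CORRELATION

If the device physically moves but the picture does not react (or vice versa), the feed is unlikely to come from the real camera.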

Other methods include comparing parameters such as ISO, aperture, frame rate, and changes in lighting conditions that can give a fake image away. Additionally, analyzing individual video frames can reveal signs of manipulation, such as double-compressed regions of the image or traces of deepfake artifacts.
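
One frame-level check along these lines is a simple error-level analysis, sketched below with OpenCV; the JPEG quality, block size, and ratio threshold are arbitrary assumptions. The idea is that regions that were already compressed before being spliced into the frame tend to change less on recompression than the rest of the image.

import cv2
import numpy as np

def error_level_map(frame, quality=90):
    """Re-encode a frame as JPEG and return the per-pixel absolute difference."""
    ok, encoded = cv2.imencode(".jpg", frame, [int(cv2.IMWRITE_JPEG_QUALITY), quality])
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    recompressed = cv2.imdecode(encoded, cv2.IMREAD_COLOR)
    return cv2.absdiff(frame, recompressed)

def looks_spliced(frame, ratio_threshold=4.0) -> bool:
    """Flag a frame whose error levels differ strongly between 32x32 blocks."""
    ela = cv2.cvtColor(error_level_map(frame), cv2.COLOR_BGR2GRAY).astype(np.float32)
    h, w = ela.shape
    if h < 32 or w < 32:
        raise ValueError("frame too small for block analysis")
    blocks = ela[: h - h % 32, : w - w % 32].reshape(h // 32, 32, w // 32, 32)
    block_means = blocks.mean(axis=(1, 3))
    # A large spread between the cleanest and noisiest blocks is suspicious;
    # the ratio threshold is a placeholder, not a calibrated value.
    return float(block_means.max()) > ratio_threshold * (float(block_means.min()) + 1e-6)

Such heuristics are noisy on their own and are typically combined with trained deepfake detectors rather than used as a standalone decision rule.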
 