How do carders try to bypass Device Fingerprinting?

For educational purposes, I will give a more detailed explanation of how anti-fraud systems collect device fingerprints, how carders try to bypass them, and why these attempts are often detected through anomalies. I will focus on the technical aspects, carder methods, and protection mechanisms in the context of carding (credit card fraud). It is important to note that this information is intended solely for understanding the protection mechanisms.

1. How antifraud systems collect device fingerprints​

Device Fingerprinting is the process of creating a unique device or browser identifier based on a variety of parameters that are collected both passively (without user interaction) and actively (via JavaScript, API, and other technologies). Anti-fraud systems used by banks, payment systems, and online stores collect data to identify users and detect suspicious activity. Here's a detailed breakdown of what data is collected and how:

1.1 Browser characteristics​

  • User-Agent: A string sent by the browser in the HTTP headers that contains information about the browser type (Chrome, Firefox, Safari), its version, the operating system (Windows, macOS, Linux, Android), and sometimes the device type (mobile, desktop). For example: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.81 Safari/537.36.
  • Language settings: Browser language (e.g. en-US, ru-RU) and regional settings (date, time, number format), accessible via JavaScript (navigator.language or navigator.languages).
  • Screen and window resolution: Screen size (screen.width, screen.height), color depth (screen.colorDepth), browser window size (window.innerWidth, window.innerHeight). These parameters depend on the device and its settings.
  • Plugins and extensions: Through navigator.plugins, antifraud systems can obtain a list of installed plugins (e.g. Adobe Flash, Java). The absence of plugins or their unusual combination is also taken into account.
  • Fonts: The list of fonts available on the system is collected via JavaScript (e.g. FontFaceSet API or text rendering check). Different OS and devices have unique font sets, making this parameter important for identification.
  • Canvas Fingerprinting: Anti-fraud systems use HTML5 <canvas> to render images or text that depend on the video card, drivers, and OS. Even small differences in rendering create a unique hash. For example, a test might involve drawing a complex shape with gradients and text (a minimal collection sketch follows this list).
  • WebGL Fingerprinting: Similar to canvas, the WebGL API is used to render 3D graphics. The result depends on the graphics card, drivers and their settings, which makes the fingerprint unique.
  • AudioContext Fingerprinting: The AudioContext API generates a unique fingerprint based on audio processing (such as generating a sine wave) that depends on the audio drivers and hardware.
  • HTTP Headers: The order and values of headers (e.g. Accept, Accept-Encoding, Accept-Language) may vary depending on the browser and its settings.
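
Taken together, these signals can be read with only a little client-side code. Below is a minimal, simplified sketch in TypeScript of how a fingerprinting script might gather a few of the parameters listed above; it illustrates the general technique, not the code of any real anti-fraud SDK, and real collectors gather far more data and obfuscate the process.

```typescript
// Simplified illustration of browser-signal collection (not a real anti-fraud SDK).
interface BrowserSignals {
  userAgent: string;
  languages: readonly string[];
  screen: { width: number; height: number; colorDepth: number };
  timezone: string;
  canvasHash: string;
}

// Render a fixed scene to <canvas>; the resulting pixels vary slightly across
// GPUs, drivers, and OSes, which is what makes the value distinctive.
function canvasFingerprint(): string {
  const canvas = document.createElement("canvas");
  canvas.width = 200;
  canvas.height = 50;
  const ctx = canvas.getContext("2d");
  if (!ctx) return "no-canvas";
  ctx.textBaseline = "top";
  ctx.font = "14px Arial";
  ctx.fillStyle = "#f60";
  ctx.fillRect(10, 10, 100, 30);
  ctx.fillStyle = "#069";
  ctx.fillText("fingerprint-test", 2, 2);
  return canvas.toDataURL(); // in practice this string would be hashed
}

function collectBrowserSignals(): BrowserSignals {
  return {
    userAgent: navigator.userAgent,
    languages: navigator.languages,
    screen: {
      width: screen.width,
      height: screen.height,
      colorDepth: screen.colorDepth,
    },
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    canvasHash: canvasFingerprint(),
  };
}
```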

1.2. Hardware specifications​

  • Operating System: Determined via User-Agent or specific APIs such as navigator.platform.
  • Hardware: Processor performance, RAM size, and other parameters can be estimated through JavaScript benchmarks (e.g., timing computation or rendering). Some systems use APIs such as navigator.hardwareConcurrency to determine the number of processor cores (a short sketch follows this list).
  • Network parameters: IP address, geolocation (according to GeoIP databases), connection type (Wi-Fi, mobile Internet, wired connection), provider. Network delay (latency) is also analyzed through server requests.
  • Battery (less common): If available via the Battery API, anti-fraud systems may collect data about the battery charge level or device operating time.
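
A small sketch of reading some hardware-related signals is below. Note that navigator.deviceMemory, navigator.connection, and navigator.getBattery() are non-standard or unavailable in many browsers, so the sketch treats them as optional; the interface shape is an assumption for illustration.

```typescript
// Optional hardware/network signals; every field may be undefined in some browsers.
interface HardwareSignals {
  platform: string;
  cpuCores: number;
  deviceMemoryGb?: number;   // navigator.deviceMemory (Chromium only)
  connectionType?: string;   // navigator.connection.effectiveType (non-standard)
  batteryLevel?: number;     // Battery Status API, where still exposed
}

async function collectHardwareSignals(): Promise<HardwareSignals> {
  const nav = navigator as any; // non-standard fields are not in the TS DOM types
  const signals: HardwareSignals = {
    platform: navigator.platform,
    cpuCores: navigator.hardwareConcurrency,
    deviceMemoryGb: nav.deviceMemory,
    connectionType: nav.connection?.effectiveType,
  };
  if (typeof nav.getBattery === "function") {
    const battery = await nav.getBattery();
    signals.batteryLevel = battery.level;
  }
  return signals;
}
```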

1.3. Behavioural data​

  • Input patterns: Typing speed, intervals between keystrokes, mouse movements (trajectory, speed, acceleration), page scrolling. This data is collected via JavaScript events (mousemove, keydown, scroll); a small capture sketch follows this list.
  • Navigation patterns: The sequence of page transitions, time spent on the site, interactions with forms (e.g. filling in fields, clicking buttons).
  • Activity history: Anti-fraud systems can analyze how often a device interacts with a site, which pages are visited, and compare this with typical behavior.
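
The sketch below shows, in simplified form, how such behavioral telemetry might be captured on the client. The DOM events are standard; the buffering structure and what would be sent to the server are assumptions for illustration.

```typescript
// Collect coarse behavioral telemetry: mouse positions and inter-key intervals.
interface BehaviorSample {
  mousePath: Array<{ x: number; y: number; t: number }>;
  keyIntervalsMs: number[];
}

function captureBehavior(): BehaviorSample {
  const sample: BehaviorSample = { mousePath: [], keyIntervalsMs: [] };
  let lastKeyTime: number | null = null;

  document.addEventListener("mousemove", (e) => {
    // Store timestamped coordinates; trajectory, speed, and acceleration
    // can be derived from these later on the server.
    sample.mousePath.push({ x: e.clientX, y: e.clientY, t: performance.now() });
  });

  document.addEventListener("keydown", () => {
    const now = performance.now();
    if (lastKeyTime !== null) {
      sample.keyIntervalsMs.push(now - lastKeyTime); // typing rhythm
    }
    lastKeyTime = now;
  });

  return sample;
}
```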

1.4. Additional parameters​

  • Cookies and local storage: Unique identifiers stored in the browser via cookies, localStorage or sessionStorage (a short sketch follows this list).
  • Time Zone and Time: Compared with IP geolocation to detect discrepancies.
  • TLS Fingerprinting: Analysis of TLS connection parameters (e.g. protocol version, cipher suite) that depend on the browser and OS.
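
As an illustration of the cookie/local-storage point above, a collector might persist a random identifier so that repeat visits from the same browser can be linked even when other parameters drift. The storage key name below is hypothetical.

```typescript
// Persist a random identifier in localStorage (hypothetical key name).
const DEVICE_ID_KEY = "af_device_id";

function getOrCreateDeviceId(): string {
  let id = localStorage.getItem(DEVICE_ID_KEY);
  if (!id) {
    id = crypto.randomUUID();             // random, not derived from hardware
    localStorage.setItem(DEVICE_ID_KEY, id);
  }
  return id;
}
```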

1.5 How data is combined​

Anti-fraud systems such as ThreatMetrix, Forter or Sift create a unique hash or identifier based on a combination of all these parameters. For example, even if two devices use the same browser and OS, differences in fonts, canvas fingerprint, behavior or geolocation make their fingerprints unique. This data is stored in databases and compared every time a user logs in to identify changes or suspicious activity.
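
One common way to turn many parameters into a single identifier is to serialize them and hash the result. The sketch below uses the Web Crypto API (SubtleCrypto) for a SHA-256 digest; it reflects the general idea, not the actual algorithm of ThreatMetrix, Forter, Sift, or any other vendor.

```typescript
// Combine collected signals into one SHA-256 "device fingerprint" hash.
async function fingerprintHash(signals: Record<string, unknown>): Promise<string> {
  const json = JSON.stringify(signals);        // stable-enough serialization for a sketch
  const bytes = new TextEncoder().encode(json);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Usage sketch: hash of browser + hardware + storage signals
// const id = await fingerprintHash({ ...collectBrowserSignals(), deviceId: getOrCreateDeviceId() });
```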

2. How carders try to bypass Device Fingerprinting​

Carders who commit fraudulent transactions with bank cards aim to either completely hide their fingerprint or to forge it so that it looks like the fingerprint of a legitimate user (e.g. the card owner). Here is a detailed overview of their methods:

2.1 Using Virtual Machines (VM)​

  • Method: Carders create virtual machines (e.g. VirtualBox, VMware, QEMU) with a "clean" operating system (usually Windows or Linux) to emulate a new device with no history. After each fraudulent operation, the virtual machine can be reset or deleted.
  • Example: Using a Windows 10 image with a pre-installed browser and a minimal configuration, so that few unique, identifying settings accumulate.
  • Goal: Avoid accumulation of history (cookies, cache) and create the appearance of a new device.

2.2. User-Agent manipulation​

  • Method: Carders change the User-Agent string via browser extensions (such as User-Agent Switcher), developer settings, or specialized tools. They choose popular combinations, such as Chrome on Windows 10, to avoid being noticed.
  • Example: Substituting the User-Agent with Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.81 Safari/537.36.
  • Goal: Simulate a popular device and browser to match the profile of a typical user.

2.3. Anonymization networks (Tor, VPN, proxy)​

  • Method:
    • Tor: Used to mask the IP address and route traffic through multiple nodes, making it difficult to determine the real location.
    • VPN: Carders choose VPN servers in the same country as the card owner so that the geolocation matches the card data.
    • Proxies: Rotate residential proxies (IP addresses belonging to real devices, not data centers) to simulate different users.
  • Example: Using a VPN with a server in New York for a card registered in the US.
  • Purpose: To hide the real IP address and geolocation to avoid detection by location mismatch.

2.4. Antidetect browsers​

  • Method: Using specialized browsers such as Multilogin, Kameleo, Antidetect, which can fake many device parameters:
    • Spoofing of the canvas fingerprint, WebGL, fonts, User-Agent, and screen resolution.
    • Emulation of various operating systems, devices, and geolocations.
    • Creation of a unique profile for each session to avoid cross-transaction linking.
  • Example: Setting up a profile that simulates an iPhone 13 with iOS 15 and Safari, with a specific set of fonts and geolocation.
  • Goal: Create a plausible fingerprint of a device that does not arouse suspicion.

2.5. Modifying the environment​

  • Method:
    • Disabling JavaScript or blocking certain APIs (Canvas, WebGL, AudioContext) via extensions such as uBlock Origin or NoScript.
    • Change system fonts, time zone, system language or other settings manually or using scripts.
    • Using "clean" browsers without history, cookies and cache.
  • Example: Setting the time zone to match the card's country and removing all fonts except the Windows defaults.
  • Goal: Minimize the amount of data collected or fake it so that it appears natural.

2.6. Behavior Emulation​

  • Method: Using automated tools (bots) to simulate the behavior of a real user:
    • Generating natural-looking mouse movements (random trajectories, pauses).
    • Simulating page scrolling, form filling, and button clicks.
    • Adding delays between actions so that they look "human".
  • Example: Using Selenium or Puppeteer to automate browser actions with predefined scripts.
  • Goal: Bypass behavioral analysis that checks patterns of interaction with a site.

2.7. Using real devices​

  • Method: Carders can use stolen or rented devices (e.g. via the darknet) to conduct transactions. This allows for a completely believable fingerprint to be created, as real hardware is used.
  • Example: Purchasing Remote Desktop (RDP) access to a device located in the desired country.
  • Goal: Avoid suspicion related to virtual machines or fake fingerprints.

2.8. Profile rotation​

  • Method: Carders create multiple profiles (combinations of devices, IP addresses, browsers) and use them in turn to avoid linking transactions.
  • Example: Conducting one transaction with a profile that simulates a MacBook with macOS, and the next with a profile that simulates an Android device.
  • Purpose: To make it difficult to track and link fraudulent transactions.

3. Why are spoofing attempts detected through anomalies?

Anti-fraud systems use sophisticated algorithms, including machine learning, to analyze device fingerprints and detect anomalies. Even if carders use advanced counterfeiting methods, their actions are often detected for the following reasons:

3.1. Parameter inconsistencies​

  • Geolocation and time zone: If the IP address points to the USA, but the time zone is set to Moscow, this raises suspicion. Anti-fraud systems compare geolocation (by GeoIP) with the device settings (a simple check is sketched after this list).
  • User-Agent and other parameters: A fake User-Agent may not match other characteristics. For example, if the User-Agent indicates Windows, but the fonts or canvas fingerprint are specific to Linux, this is an anomaly.
  • Canvas/WebGL mismatches: Canvas fingerprints are difficult to forge because rendering depends on a combination of graphics card, drivers, and OS. Anti-detection browsers can generate "plausible" fingerprints, but they often do not match real devices in the anti-fraud database.
  • TLS Fingerprinting: TLS connection parameters (such as the cipher order) are unique to each browser and OS. Fake profiles may use non-standard combinations that stand out.
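
As a simplified illustration of the geolocation/time-zone check, a server-side rule might compare the time zone reported by the browser with the time zones expected for the GeoIP country. The function names, field names, and mapping below are assumptions for illustration.

```typescript
// Hypothetical server-side consistency check: browser time zone vs. GeoIP country.
interface SessionSignals {
  geoIpCountry: string;     // e.g. "US", resolved from the client IP
  reportedTimezone: string; // e.g. "Europe/Moscow", sent by the fingerprinting script
}

// Toy mapping; a real system would use a full IANA-timezone-to-country table.
const EXPECTED_TIMEZONES: Record<string, string[]> = {
  US: ["America/New_York", "America/Chicago", "America/Denver", "America/Los_Angeles"],
  RU: ["Europe/Moscow", "Asia/Yekaterinburg", "Asia/Novosibirsk"],
};

function timezoneMismatch(s: SessionSignals): boolean {
  const expected = EXPECTED_TIMEZONES[s.geoIpCountry];
  if (!expected) return false; // unknown country: don't flag on this rule alone
  return !expected.includes(s.reportedTimezone);
}
```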

3.2. Missing or redundant data​

  • Disabling JavaScript: If JavaScript is disabled or APIs (Canvas, WebGL) are blocked, it is suspicious as regular users rarely use such settings.
  • "Clean" device: Devices with no cookies, cache or history appear like new, which is not typical for legitimate users who usually have accumulated data.
  • Identical fingerprints: If multiple transactions use identical fingerprints (e.g. due to mass use of one profile in the anti-detect browser), this raises an alarm.

3.3. Behavioral anomalies​

  • Unnatural Patterns: Bots used to emulate behavior may have overly linear mouse paths, no pauses, or unrealistic typing speeds (a linearity check is sketched after this list).
  • Frequent profile changes: If one device or account uses different IP addresses, browsers, or configurations in a short period of time, this indicates fraud.
  • Transaction patterns: Carders often conduct transactions at high speeds or in unusual volumes (e.g. purchasing expensive items immediately after registration), which is inconsistent with the behavior of regular users.
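
One behavioral check of this kind can be sketched as follows: measure how far a recorded mouse path deviates from the straight line between its endpoints. Human trajectories usually wander, so a long path with near-zero deviation suggests synthetic input. The threshold below is an assumed value, not a documented one.

```typescript
// Flag mouse paths that are suspiciously close to a perfect straight line.
type Point = { x: number; y: number };

function maxDeviationFromLine(path: Point[]): number {
  if (path.length < 3) return 0;
  const a = path[0];
  const b = path[path.length - 1];
  const length = Math.hypot(b.x - a.x, b.y - a.y) || 1;
  let max = 0;
  for (const p of path) {
    // Perpendicular distance from p to the line through a and b.
    const d = Math.abs((b.y - a.y) * p.x - (b.x - a.x) * p.y + b.x * a.y - b.y * a.x) / length;
    if (d > max) max = d;
  }
  return max;
}

function looksLikeBotPath(path: Point[]): boolean {
  // Assumed threshold: humans rarely stay within ~2 px of a straight line over a long move.
  return path.length > 20 && maxDeviationFromLine(path) < 2;
}
```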

3.4. Characteristics of virtual machines​

  • Limited settings: Virtual machines often have default settings (e.g. fixed screen resolution, no battery, limited fonts) that differ from real devices.
  • Drivers and performance: Virtual machines can reveal themselves through poor performance (e.g. in JavaScript benchmarks) or specific drivers (e.g. VirtualBox Graphics Adapter).
  • Lack of diversity: If multiple transactions use the same virtual machine parameters, this becomes obvious to anti-fraud systems.

3.5. Tor and proxies​

  • Known IP addresses: Anti-fraud systems have databases of IP addresses associated with Tor, VPN or proxy servers. Using such IPs immediately marks the transaction as suspicious.
  • Data-center vs. residential IPs: Proxies hosted in data centers (rather than on real consumer connections) are easily identified because their address ranges belong to hosting companies rather than the consumer ISPs that serve typical users.
  • Frequent IP rotation: Changing IP addresses too frequently within a single session or account is a cause for concern.

3.6. Comparison with historical data​

  • Fingerprint change: Anti-fraud systems store fingerprint history for each user. If the device suddenly changes parameters (for example, changing OS, browser or canvas fingerprint), this is considered an anomaly.
  • Transaction Linking: If one device fingerprint is used for multiple accounts or transactions, this indicates fraud.
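
As a simplified illustration of transaction linking, a back end might keep a map from fingerprint hash to the accounts that have used it and flag a hash once it appears on too many accounts. The names and threshold below are assumptions.

```typescript
// Hypothetical server-side linking check: how many accounts share one fingerprint?
const accountsByFingerprint = new Map<string, Set<string>>();

const MAX_ACCOUNTS_PER_FINGERPRINT = 3; // assumed threshold for illustration

function recordAndCheckFingerprint(fingerprintHash: string, accountId: string): boolean {
  const accounts = accountsByFingerprint.get(fingerprintHash) ?? new Set<string>();
  accounts.add(accountId);
  accountsByFingerprint.set(fingerprintHash, accounts);
  // Returns true when the same device fingerprint has been seen on too many accounts.
  return accounts.size > MAX_ACCOUNTS_PER_FINGERPRINT;
}
```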

3.7. Machine learning and behavioral analysis​

  • Big data learning: Anti-fraud systems use machine learning algorithms to analyze millions of fingerprints and identify patterns that are typical for fraudsters. For example, they can detect that certain combinations of parameters (e.g. rare fonts + Tor) are found only in carders.
  • Profile anomalies: Even if a fingerprint appears legitimate, machine learning can identify inconsistencies based on statistical models of "normal" behavior.
  • Risk scoring: Anti-fraud systems assign each transaction a risk score based on a combination of factors (fingerprint, behavior, geolocation). A high score leads to additional checks (e.g. 3D-Secure or request for additional authentication).
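
A toy illustration of risk scoring is below: each check contributes a weight, and the total decides whether to approve, step up authentication (e.g. 3D-Secure), or decline. The weights and thresholds are invented for illustration; production systems typically rely on trained models rather than hand-set rules.

```typescript
// Toy rule-based risk score; real systems use machine-learned models.
interface RiskSignals {
  timezoneMismatch: boolean;
  knownTorOrProxyIp: boolean;
  newDeviceNoHistory: boolean;
  botLikeBehavior: boolean;
  fingerprintSharedAcrossAccounts: boolean;
}

function riskScore(s: RiskSignals): number {
  let score = 0;
  if (s.timezoneMismatch) score += 20;
  if (s.knownTorOrProxyIp) score += 35;
  if (s.newDeviceNoHistory) score += 10;
  if (s.botLikeBehavior) score += 25;
  if (s.fingerprintSharedAcrossAccounts) score += 30;
  return score;
}

function decision(score: number): "approve" | "step-up" | "decline" {
  if (score >= 60) return "decline"; // assumed thresholds
  if (score >= 30) return "step-up"; // e.g. trigger 3D-Secure
  return "approve";
}
```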

4. Why a complete fake is almost impossible​

Complete counterfeiting of Device Fingerprinting is extremely difficult for the following reasons:
  1. Complexity of combinations:
    • The fingerprint consists of dozens or hundreds of parameters, and counterfeiting each of them requires deep technical knowledge. For example, counterfeiting a canvas fingerprint requires emulating not only the browser, but also the video card, drivers, and OS.
    • Even small discrepancies (such as a rare order of HTTP headers) can give away a fake.
  2. Dynamic adaptation of antifraud systems:
    • Anti-fraud systems constantly update their data collection methods and analysis algorithms. For example, they can start using new APIs or tests (such as checking touch events on mobile devices) that carders do not have time to fake.
    • The use of machine learning allows systems to adapt to new evasion methods.
  3. Contextual analysis:
    • Even if the fingerprint looks legitimate, anti-fraud systems analyze the context of the transaction. For example, buying an expensive item from a new device in a country different from the card country raises suspicions.
    • Transaction history (such as multiple attempted purchases with different cards) can give away the carder, even if the fingerprint is faked.
  4. Resources and costs:
    • Forging a fingerprint requires significant resources: purchasing residential proxies, setting up anti-detection browsers, using real devices or complex virtual environments.
    • For mass carding (e.g. checking thousands of cards), such methods become economically unviable.
  5. Multi-level protection:
    • Anti-fraud systems combine Device Fingerprinting with other protection methods such as 3D-Secure, transaction analysis, biometrics (e.g. facial recognition) and card data verification. This creates additional barriers for carders.

5. Examples of real scenarios​

  1. Scenario 1: Using Tor:
    • The carder uses Tor to mask the IP and an anti-detection browser to fake the canvas fingerprint. However, the anti-fraud system notices that the IP belongs to a known Tor node, and the canvas fingerprint does not match the typical one for the declared OS (e.g. Windows). The transaction is rejected.
  2. Scenario 2: Virtual Machine:
    • The carder conducts a transaction from a virtual machine configured as Windows 10 with Chrome. The anti-fraud system detects no battery, low performance, and a standard set of fonts typical for VirtualBox. The transaction is marked as suspicious.
  3. Scenario 3: IP Rotation:
    • The carder uses residential proxies to rotate IP addresses to create the appearance of different devices. However, the anti-fraud system notices that all IPs belong to the same provider or are used for a short time, which does not correspond to normal behavior.
  4. Scenario 4: Behavioral Anomalies:
    • The carder uses a bot to fill out the payment form. The anti-fraud system detects that the mouse movements are too linear and the time it takes to fill out the form is unrealistically short. This leads to an additional check (for example, a request for a 3D-Secure code).

6. Conclusion​

Anti-fraud systems collect device fingerprints through complex combinations of browser, hardware, and behavioral data, creating unique identifiers that are difficult to counterfeit. Carders use virtual machines, anti-detection browsers, Tor, VPN, proxies, and other methods to bypass Device Fingerprinting, but these attempts are often detected due to parameter inconsistencies, behavioral anomalies, and big data analysis. Machine learning and multi-layered protection make complete fingerprint counterfeiting virtually impossible, especially on a mass scale. For educational purposes, it is important to understand that anti-fraud systems are constantly improving, and carding remains a high-risk and illegal activity that entails serious consequences.
 