Algorithms in the clutches of scammers: How social media is becoming a recruiting factory for financial pyramid schemes and manipulation

Professor

Introduction: When the News Feed Becomes a Battlefield

Imagine scrolling through your TikTok or Instagram feed after a hard day. Between funny videos and friends' posts, a bright banner flashes: "Earn $1,000 in a month without doing anything!" or "The secret crypto scheme that banks are hiding!" You scroll past, but the algorithm has remembered your one-second hesitation. Tomorrow, there will be more such offers. The day after, you'll see an emotional review from a "happy investor." This isn't a coincidence. It's the result of algorithms that increasingly fall into the hands of digital scammers, turning social media into a recruiting ground for dubious schemes and a channel for manipulative information.

Part 1: The Anatomy of Abuse – How Exploitation Mechanisms Work

1.1. Algorithmic Vulnerability: Predictability on the Verge of Manipulation

The algorithms behind TikTok's For You page and Instagram's Reels and Explore feeds are built on the principle of maximizing engagement. They learn from your every action: a pause, a like, a video watched to the end. Fraudsters, often with real marketing expertise, create content that triggers key emotions:
  • Greed: get-rich-quick stories.
  • Fear: "Banks will collapse; only crypto will save you!"
  • Curiosity: "I'm revealing a secret that was banned in the media!"
  • Social proof: fake reviews and staged videos of "luxury living."

The system, seeing high engagement, begins to recommend such content en masse to similar audiences.
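The dynamic above can be illustrated with a toy ranker. This is a minimal sketch with invented signal names and weights, not any platform's real formula: the ranker scores a post purely by how users react to it, so emotionally charged scam content that drives long watch times and shares floats to the top regardless of its substance.

```python
# Toy engagement-driven ranking. Post fields and weights are invented
# for illustration -- no real platform publishes its formula.

POSTS = [
    {"id": "cat_video",  "watch_ratio": 0.55, "like_rate": 0.04, "share_rate": 0.01},
    {"id": "scam_promo", "watch_ratio": 0.90, "like_rate": 0.07, "share_rate": 0.05},
    {"id": "news_clip",  "watch_ratio": 0.40, "like_rate": 0.02, "share_rate": 0.01},
]

# Assumed weights: shares count more than likes, likes more than watch time.
WEIGHTS = {"watch_ratio": 1.0, "like_rate": 5.0, "share_rate": 10.0}

def score(post):
    """Weighted sum of engagement signals -- the only thing this toy
    ranker 'knows' about a post is how people react to it."""
    return sum(WEIGHTS[k] * post[k] for k in WEIGHTS)

ranked = sorted(POSTS, key=score, reverse=True)
print([p["id"] for p in ranked])  # the scam promo wins on engagement alone
```

Nothing in the scoring function looks at what the post actually says, which is exactly the blind spot scammers exploit.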

1.2. Targeting Techniques: A Sniper's Search for Victims

Fraudsters use:
  • Lookalike audiences: by uploading data on users who have already engaged, they ask the algorithm to find similar people.
  • Interest targeting: narrowing reach to users interested in cryptocurrencies, quick money, and financial advice.
  • Temporal and behavioral targeting: serving content on Sunday evenings, when people are more reflective, or during periods of economic instability.
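The lookalike mechanism in the first bullet can be sketched as nearest-neighbor matching. Everything here is invented for illustration (the user names, the three-dimensional "interest vectors," the idea of averaging a seed audience): real ad platforms use far richer features, but the principle is the same cosine-style similarity search.

```python
import math

# Toy lookalike matching: each user is a vector of interest intensities,
# e.g. (crypto content, quick-money content, finance tips). All values
# and names are hypothetical.

SEED = [0.9, 0.8, 0.7]          # averaged profile of already-hooked users
CANDIDATES = {
    "alice": [0.85, 0.9, 0.6],  # heavy crypto / quick-money viewer
    "bob":   [0.1, 0.0, 0.2],   # barely touches finance content
    "carol": [0.7, 0.6, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two interest vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Rank candidates by similarity to the seed audience; the most
# "seed-like" users are the ones the campaign targets first.
lookalikes = sorted(CANDIDATES, key=lambda u: cosine(SEED, CANDIDATES[u]), reverse=True)
print(lookalikes)
```

The user who already consumes finance-adjacent content ends up at the top of the target list; the uninterested user lands at the bottom, which is why engaged victims effectively recruit people like themselves.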

1.3. Escalating Engagement: From Viewing to Action

The user's path is built like a funnel:
  1. Attracting people with viral, seemingly harmless content (memes about poverty, success stories of "ordinary guys").
  2. Deepening trust through a series of videos that create an illusion of expertise.
  3. Redirection to closed Telegram chats, fake landing pages, and private webinars.
  4. Monetization – direct sales of “courses,” fundraising in a “pool,” recruiting into a pyramid scheme.
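The arithmetic behind this funnel explains why scammers favor broad viral reach. With invented but plausible conversion rates at each stage, a back-of-the-envelope model shows how a million views can quietly yield hundreds of paying victims even when the end-to-end conversion is a fraction of a percent:

```python
# Toy funnel model. All rates are hypothetical, chosen only to show
# the order of magnitude; real conversion rates are not public.

reach = 1_000_000                 # users shown the viral hook
stages = {
    "watched_series": 0.05,       # kept watching the "expert" videos
    "joined_telegram": 0.10,      # clicked into the closed chat
    "paid": 0.08,                 # bought the "course" / joined the pool
}

victims = reach
for stage, rate in stages.items():
    victims = int(victims * rate)
    print(f"{stage}: {victims}")

print("end-to-end conversion:", victims / reach)
```

Run with these assumed rates, the model leaves 400 paying victims from a million impressions: a 0.04% conversion that costs the scammer almost nothing, since the algorithm supplies the reach for free.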

Part 2: News Hooks as a Weapon of Mass Attraction

Fraudsters don't just sell hot air. They create and exploit newsworthy events to generate a steady stream of potential victims.
  • Example 1: Crypto hype. Riding the wave of hype around a new cryptocurrency, hundreds of accounts posing as experts are created. They spread fake news ("This coin will skyrocket after listing on an exchange!") to pump and dump the asset.
  • Example 2: Social discontent. During periods of rising inflation or unemployment, campaigns are launched offering "earnings through de-dollarization" or "stable income during a crisis."
  • Example 3: Fake grants and government support. News stories are created about non-existent assistance programs that require an "entry fee" to access.

These news hooks fit perfectly into the logic of algorithms that promote trending and emotionally charged topics.

Part 3: Scale and Consequences – More Than Just Financial Loss

  • Financial damage: according to the FTC, US consumers reported losing more than $2.7 billion to scams that began on social media between January 2021 and mid-2023.
  • Data as currency: often the goal is not a direct transfer of money, but the collection of bank data and passport information for subsequent sale.
  • Social erosion: the spread of financially illiterate behavior patterns, the erosion of trust in all financial institutions, and rising social anxiety.
  • Legitimizing the scam: When the same "guru" appears in the feeds of several acquaintances, it creates a false impression of the scheme's legitimacy.

Part 4: Is the Algorithm Responsible? Legal and Ethical Dilemmas

The key question is: who is to blame? The bot account, its creator, or the platform that provided the tools for viral spread?
  • Platforms' position: Meta and TikTok claim to be fighting back. They use AI to scan content for keywords ("guaranteed income," "quick money"), block ads for financial pyramid schemes, and work with fact-checkers. But their systems react after the fact, and at their scale they cannot catch everything.
  • The "cat and mouse" problem: scammers constantly change their wording, use coded language ("ice" instead of "crypto," emoji instead of words), and switch to live broadcasts, which are harder to moderate.
  • Legal vacuum: legislation on the digital economy and algorithmic liability has not kept pace with technology. Proving a platform's guilt when its algorithm inadvertently promotes fraud is extremely difficult.
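The cat-and-mouse problem described above is easy to demonstrate. The sketch below uses an invented blocklist and exact whole-word matching, which is cruder than any real moderation pipeline, but it shows the structural weakness: the moment a scammer swaps a letter or a slang term, a keyword filter sees nothing wrong.

```python
import re

# Toy keyword-based moderation filter. The blocklist is invented;
# real platforms use ML classifiers on top of keyword rules.

BLOCKLIST = {"guaranteed income", "quick money", "crypto"}

def flags(text):
    """Return the blocklisted phrases that appear as whole words in text."""
    low = text.lower()
    return {kw for kw in BLOCKLIST
            if re.search(r"\b" + re.escape(kw) + r"\b", low)}

plain = "Guaranteed income from crypto, join now!"
obfuscated = "Guaranteed 1ncome from 'ice', join now!"  # scammer slang

print(flags(plain))       # both phrases caught
print(flags(obfuscated))  # empty set -- the filter is blind to the edit
```

One character substitution and one slang word defeat the filter entirely, which is why moderation keeps lagging behind the latest wording.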

Part 5: Practical Self-Defense: How to Avoid Becoming a Target of Scammers' Algorithms

  1. Critical perception: Any promise of super-profits with minimal effort is a red flag.
  2. Source verification: no verified website, no legal information, no real reviews outside of social media means no trust.
  3. Technical settings: Limit targeted advertising in your privacy settings and disable in-app tracking.
  4. Educational immunity: A basic understanding of how financial markets work and classic scams (Ponzi schemes) is the best defense.
  5. Report: Use the platforms' built-in features to report fraudulent content. This also trains the moderation algorithm.

Conclusion: The Future Lies in Cooperation

The problem of using algorithms for fraud is systemic. It can't be solved by targeted blocking. Cooperation is needed:
  • Platforms should implement predictive rather than reactive moderation and increase algorithmic transparency in key areas.
  • Regulators should develop digital legislation that establishes clear boundaries of responsibility.
  • Educational institutions should implement digital and financial literacy courses.
  • Users should develop digital hygiene and skeptical thinking.

Algorithms are a mirror reflecting both the best and worst aspects of human behavior. Society's task is to ensure that this mirror doesn't become a tool for deception, but remains a tool for connection and knowledge. For now, every social media session is not only a relaxation experience but also a test of vigilance.
 