Cloned Boy
You think you're watching the Internet. But in reality, the Internet is watching you. While you're liking, googling, and just staring at the screen, algorithms are calculating your weaknesses, reading your emotions, and reselling your data.
In this topic, you'll learn how a person leaves digital traces and what you can learn about yourself from one comment. Why social networks know you better than your own mother, and how AIs are quietly molding your personality to suit someone's goals. You'll also learn the risks of subscribing without reading the terms and conditions, and how much your "privacy" costs on the dark market. And finally, who are these puppeteers who keep you on a leash, and is it possible to break this chain? Get ready, after this topic, you won't be able to say: "I have nothing to hide."
Contents:
- How do you leave traces on the network?
- Identification data
- Biometrics
- Technical data
- How do your behavioral data "burn" you?
- How do algorithms manipulate you?
- What are your weaknesses that algorithms recognize?
- Fear
- Sexual preferences
- Addictions
- Whose side is the Internet on?
- How did the pandemic become an experiment in global opinion management?
- How are TikTok algorithms structured in different countries?
- How do they know all this?
- What scandal did Google get into in 2019?
- My personal experiment - how much can I learn about a person just from the Internet
- How do they steal your data and how do they use it?
- The most famous database leaks
- Who are techies and manipulators?
- Neural networks as a special type of danger
- How are people used to train AI?
- What is the social rating system in China?
- Is it possible to completely log out of the system?
- Conclusion: how to understand that you are already in the hands of puppeteers?
In this topic, we will analyze how digital corporations make a product out of you, why social networks know more about you than your mother, what you can pay for being lazy to read the user agreement, and how much your data is worth on the black market. This is not a conspiracy theory, this is ordinary business. The mouse thinks that it is exploring a maze, but in fact, it is the maze that studies the behavior of the mouse. While you think that you are studying something on the Internet, you are being assessed, weighed, labeled and sold.
How do you leave traces on the network?
You take your phone, google the symptoms of appendicitis, connect to Wi-Fi in a cafe, watch YouTube at night, order 18+ goods on marketplaces. All of this seems innocent, but all of it is digital traces. Every action you take is a crumb on the digital floor.
Modern IT systems track what you do and how. These are behavioral markers, data that costs money. Every 5 minutes you spend online enriches someone with another million. You can be used for completely different purposes. Experiments, business, politics. But first, let's talk about what the Internet knows about you. There are 4 main types of data that you have left behind during your life.
Identification data.
1. Identification data. Your name, phone number, email, address, INN (taxpayer number), passport data. They have long been in databases, and not only the state's, but also those of marketers, banks, and advertising networks. Did you apply for a loan somewhere in 2018? Rented a car? Bought tickets online? Even if none of this applies, you probably have an account on government services. In 2022, a database of portal users surfaced on the darknet.
2. Biometrics. Passport data can, if necessary, be changed, so to speak, to start from scratch, which cannot be said about biometrics. Getting your Face ID or fingerprint data means getting access forever to everything you have ever used or will ever use with this identification method, since it is impossible to change these parameters during your life without resorting to radical plastic surgery. Stealing this data is no more difficult than stealing regular data. They just haven't fully learned how to use it yet. But that time is coming. Neural networks are rapidly learning to fake people. In 2023, PornHub featured over 20,000 deepfake videos with the faces of ordinary people who didn't give their consent. A program like ElevenLabs can reproduce a person's voice from a 30-second recording.
Leaks from Clearview AI showed that the company scraped billions of photos from social networks and trained its AI on them without permission.
Technical data.
It's not just your network name, Wi-Fi, and browser history. It's a chain of thousands of little things that your device leaves on the Internet.
IP address, IMEI, phone model, website cookies, cache, logins, passwords, saved card numbers, all the information you send or receive. This is the most complete dossier on you, more valuable even than passport data.
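To see how individually "harmless" technical attributes add up to a dossier, here is a minimal sketch of device fingerprinting: combine a few attributes a browser freely reports and hash them into a stable identifier. The attribute names and values here are invented for illustration; real trackers use dozens more signals (canvas rendering, installed fonts, audio stack).

```python
import hashlib
import json

def device_fingerprint(attrs: dict) -> str:
    """Hash a set of device attributes into one stable identifier.
    No cookies and no login are required: the same device yields
    the same ID on every site that runs this."""
    canonical = json.dumps(attrs, sort_keys=True)  # stable ordering
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical visitor attributes, freely readable by any web page.
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "Europe/Moscow",
    "language": "ru-RU",
    "platform": "Win32",
}

print(device_fingerprint(visitor))
```

Each attribute alone identifies millions of people; the combination often identifies exactly one.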
How do your behavioral data "burn" you?
At first glance, behavioral data seems the most harmless: how much time you spend on TikTok, which videos you watch, which topics you like, what you write in private messages. And what you write is one thing; how you write is another.
Hot-tempered or measured, with mistakes or without, everything is taken into account. Even typing speed matters. If you collect all this data, you have no private life: your entire personality is scattered across the network in trifles. This data does not lie dead weight. It works, but not for you, and more often against you: it is analyzed and compared, and a digital portrait is compiled from it, a set of characteristics and predicted states.
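A toy sketch of such a "digital portrait": each behavioral signal becomes a feature, and simple rules over the combination produce inferences no single signal would support. The thresholds and flag labels here are invented for illustration, not taken from any real analytics product.

```python
from dataclasses import dataclass

@dataclass
class BehavioralProfile:
    avg_session_minutes: float   # how long you stay per session
    night_activity_share: float  # 0..1, fraction of use after midnight
    typing_speed_cpm: int        # characters per minute in messages
    typo_rate: float             # mistakes per 100 characters
    top_topics: tuple            # what you like and watch

    def risk_flags(self) -> list:
        """Illustrative inferences a pipeline might draw; the cutoffs
        are hypothetical."""
        flags = []
        if self.night_activity_share > 0.5:
            flags.append("possible insomnia / anxiety")
        if self.avg_session_minutes > 120:
            flags.append("high engagement, susceptible to feed loops")
        if self.typo_rate > 5 and self.typing_speed_cpm > 300:
            flags.append("impulsive typing style")
        return flags

me = BehavioralProfile(150.0, 0.6, 350, 7.5, ("true crime", "diets"))
print(me.risk_flags())
```

None of these inputs is secret on its own; the profile only exists once they are joined.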
You might say, so what, let them collect it, it doesn’t affect me. But what if I told you that by reading your personality, they can make you do anything, even choose a president.
How do algorithms manipulate you?
Imagine that your like under a post was used to understand what you were afraid of or what you dreamed of, to instill fear or hatred in you, so that you would vote the right way.
This happened in 2016, when the American elections changed the idea of democracy forever. Cambridge Analytica was a private analytics firm associated with a British billionaire and political strategists. Its official mission was to use data and behavior to improve political communication. But in fact, it was a mind-hacking company.
They gained access to data from 87 million Facebook accounts and created psychological profiles. You are afraid of migrants, you are angry at the government, you feel lonely, you do not trust the elites: the algorithms see this from your reactions, your recommendation feed, and your subscriptions. Research has shown that 300 likes can predict a person's character better than any psychological technique known today. Then the processing began: each type received a personalized information attack.
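The likes-to-personality idea can be sketched in a few lines: treat each liked page as carrying weights toward traits, sum them over a user's likes, and target the dominant trait. The pages, traits, and weights below are entirely invented for illustration; the actual research used dimensionality reduction over millions of users, not a hand-written table.

```python
# Hypothetical weights: each liked page nudges trait scores.
TRAIT_WEIGHTS = {
    "gun club":       {"fearful": 0.6, "conservative": 0.7},
    "meditation app": {"anxious": 0.5, "openness": 0.4},
    "crime news":     {"fearful": 0.8, "anxious": 0.3},
    "expat forum":    {"openness": 0.6},
}

def profile_from_likes(likes):
    """Sum trait weights over likes; the dominant trait decides
    which 'personal information attack' this user would receive."""
    scores = {}
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] = scores.get(trait, 0.0) + weight
    return max(scores, key=scores.get) if scores else None

print(profile_from_likes(["crime news", "gun club"]))  # prints "fearful"
```

With a few hundred likes, even a crude scheme like this separates users into sharply distinct targeting buckets.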
Horror stories about Mexican crime, fakes and promises of exactly what you dream of, dirt on competitors. Was it a lie? Partially. Some news was made up, others were taken out of context. But it does not matter. The main thing is that the algorithm knew how to make a person think in the right direction. According to official data, more than 10 million Americans changed their minds after these campaigns, and Trump himself won by only 77 thousand votes in key states.
Coincidence? Curiously, these are exactly the states where Cambridge Analytica was most active. Later, Facebook acknowledged the leak, and the company was fined $5 billion. Mark Zuckerberg stood before Congress with the face of a beaten dog and said, "We made a mistake, I'm sorry."
Cambridge Analytica closed down and most likely relaunched under another name, and similar technologies have long been used everywhere.
What are your weaknesses that algorithms recognize?
In the world of digital manipulation, there are three golden keys to a person - their fears, their sexual preferences, and their addictions, because they are directly related to the subconscious, and therefore to behavior that a person cannot control.
Fear.
First, fear is the main driver of decisions. People most often act not out of desire, but out of anxiety. You do not want to lose security, status, health, control. Social media algorithms fuel anxiety, show you frightening content, create a sense of threat, and then offer a solution that is beneficial to them. TikTok research has shown that negative content is much more popular than positive and neutral content, and spreads 6 times faster.
And according to internal Meta documents (the 2021 Frances Haugen leak), Instagram knew it was harming the mental health of teenagers, especially girls. One of the quotes: "Instagram makes one in three teenage girls feel worse about their bodies." The company did nothing about it and continued to increase engagement through anxiety-driven algorithms.
Sexual preferences.
This is the door to your inner world, something you wouldn't tell even a close friend. But the algorithms know everything without words. Instagram knows what bodies you like. Pornhub knows when you're alone and what you click on. The algorithm knows who to show you in ads, how to build trust, and how to turn off your brain. Sex is a universal hook.
Addictions.
Addictions are a guarantee of predictability.
Just like with cigarettes and sweets, you come back: to a new video, to certain emotions, topics, pictures. The YouTube algorithm knows that you will come today, tomorrow, and always for your dopamine rush; it trains you like Pavlov's dog. And the better they know what you are hooked on, the easier it is to control your actions, from buying a Snickers to supporting a law on digital control.
Whose side is the Internet on?
Remember how in the early 2010s the Internet seemed like a territory of absolute freedom, where everyone could speak out? Now this utopia looks like a strange joke.
Of course, you can try to promote your "wrong" opinion, but it will be like shouting into a vacuum. Algorithms and moderators in the offices of Facebook, Twitter, YouTube, and TikTok decide what is true and what is false.
How did the pandemic become an experiment in global opinion management?
The pandemic era has become an experiment in global opinion management.
YouTube deleted hundreds of thousands of videos by doctors who disagreed with the official agenda. Facebook banned groups for discussing the side effects of vaccines. Twitter labeled any doubt about their effectiveness as dangerous disinformation. Why? Because they said something that was not in the manual, something that was not beneficial to billionaires, the WHO, or governments. The worst thing is that you end up not seeing reality, and this shapes your worldview.
How are TikTok algorithms structured in different countries?
You will think the way they want, act the way they want, and convince other people to do the same. And did you know that TikTok's algorithms are not the same in all countries? In China, TikTok, aka Douyin, is structured so that children are shown mainly educational content, forcibly, regardless of preferences: scientific achievements, patriotic videos. And what do they show in Russia, America, and other countries?
Everything we are used to seeing there. Degradation of varying degrees. Why? Because the algorithm is smarter than it seems. It is a tool of influence. Many people still think that social networks are neutral platforms. But no. These are corporations with their own interests, and the methods for achieving these interests sometimes cross the line of reason.
How do they know all this?
Just 6 years ago, the idea that your phone is listening to you ranked alongside horror stories for grannies about the harm of 5G towers.
It seemed a conspiracy theory until it became the norm of life. We have already gotten used to the fact that after talking about a vacation next to the phone, we will watch ads for hotels and cheap tickets on every social network for another month. Why is this possible? Is it legal? One situation suggests the answer.
What scandal did Google get into in 2019?
In 2019, Google found itself at the center of another scandal.
It turned out that the system analyzed and collected information from personal emails, including photos and videos. In its defense, Google said it did nothing that other services did not do. And after that, it simply added one paragraph to the user agreement and does exactly the same thing, only now legally. Here is the answer to why we always leak absolutely everything about ourselves.
No one steals anything; they just take what you do not forbid them to take. The British online store GameStation, as an April 1 joke, added a clause to its user agreement about selling your soul: "Do you agree to give us the opportunity to declare our rights to your immortal soul?" Almost 8 thousand users, 88%, accepted the terms without looking. This shows how easy it is to sign yourself up for anything at all without even knowing it.
Next time you download a Flashlight or Calculator app and wonder why it needs access to your camera and microphone, read the agreement. Most likely, there will be a line like this: "We can share your data with partners and their affiliates, including third parties, as well as persons directly or indirectly related to them." In plain terms: we can sell your data to anyone, and you will no longer be able to track where it ends up.
One day, you can find yourself in some database for 500 rubles, which is sold through Telegram bots, and there will be not just your digital portrait, there may be your entire life.
My personal experiment - how much can I learn about a person just from the Internet.
I conducted an experiment, among my subscribers there is one guy who constantly shits in the comments, but does not unsubscribe, watches everything and shits. I decided to check how much I can learn about him just from the Internet.
My algorithm of actions was as follows. I studied the profile. There was almost no information, only two photos: one showed his face, the other was just a street shot. I found out his city through an image search. In the same way, I found that he had posted the same photos on VK. From the VK profile I learned his date of birth, that he was married and had a child, the first and last names of his daughter and wife, her account, and also confirmed the city.
Just by googling, I found his ad on a site with vacancies and resumes. It listed a phone number. Knowing his first name, last name, date of birth, and phone number, I went to a Telegram bot, paid 200 rubles, and received additional information: the make, model, and plate number of his car, his position, his address, and information about border crossings. But this was still not enough even to take out a microloan. I spent a couple of days searching for passport data, and it seemed that legal methods were exhausted.
Then I returned to social networks again and decided to carefully study the profiles of family members, relatives and friends again. And I found it. A year ago, his wife posted a photo of her and her husband's visa. It's brilliant that she blurred the visa number in the photo, but left the passport series number. On the tax office website, I found out the taxpayer identification number using the passport, then on the bailiffs' website, I found out that the man had two court cases.
One was about inheritance, the second about causing minor harm. On top of these little things, I also collected information from social networks: where he studied, what year he graduated, who his close friends were, and a bunch of not very useful details with which you can almost completely reconstruct his life story. In total, starting from scratch, I learned the man's full name, passport series and number, taxpayer identification number, phone number, car make and plate number, place of work and address, marital status, the first and last names of his family members, his places of study, when and where he traveled abroad, and what he had been charged with.
I collected everything in one document and sent it to the owner of the information in Telegram to see his reaction. With his permission, I am showing it to you.
"Listen, okay, I won't lie. Honestly, I was a little horrified."
How does such data get online? Most often through one's own carelessness: unprotected Wi-Fi, simple passwords, falling for phishing, or just bad luck, like a leak from government services. An average user has about 87 accounts on different services, and there are hundreds of ways your data could be stolen from them.
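One thing you can check yourself is whether a password of yours already circulates in known breach dumps. The Pwned Passwords service exposes a k-anonymity API: you SHA-1 the password locally and send only the first 5 hex characters of the hash, so the password itself never leaves your machine. A minimal sketch:

```python
import hashlib
import urllib.request

def hash_parts(password: str):
    """Split the SHA-1 of the password for the k-anonymity range API:
    only the 5-char prefix is ever sent over the network."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest[:5], digest[5:]

def count_in_response(suffix: str, body: str) -> int:
    """Parse the 'SUFFIX:COUNT' lines the API returns for a prefix."""
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0

def pwned_count(password: str) -> int:
    prefix, suffix = hash_parts(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:  # network call
        return count_in_response(suffix, resp.read().decode())

# print(pwned_count("password123"))  # network required; expect a large number
```

If the count is nonzero, that password is in criminals' cracking dictionaries and should be retired everywhere.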
How do they steal your data and how do they use it?
The Internet is not a library, but a market, and the main product in this market is you, or more precisely, your data.
It is easier to steal your identity than a bicycle. Let's start with simple leaks. What do you know about them? Loud hacker attacks are, of course, spectacular, but 80% of cases are leaks from within, and the company itself may never talk about it, so you don't know when and where your passport scans leaked. Here are several cases that couldn't be hidden.
The most famous database leaks.
1. State Services. In 2022, a database of users of the State Services portal surfaced on the Darknet.
Price from 500 rubles per person.
2. Databases of mobile operators. SIM card databases were sold, including call details and geolocations. The leaks were massive, especially for MTS and Beeline. Brokers in Telegram sold access with the ability to look up anyone.
3. Dnevnik.ru leak. Even schoolchildren are not safe. In 2023, a database from a school online diary was sold on forums: full names of students, grades, parents' passwords. The price for 10 thousand lines was 50 bucks.
4. Chinese databases. The Chinese had a database of 1 billion people. The largest leak in history. Police reports, complaints, passport scans.
5. Yahoo. 3 billion accounts leaked.
6. LinkedIn. 700 million.
7. Facebook. Phone numbers and names. 533 million people.
But stealing is just the beginning. The fact of losing data is not so scary. What is scary is what can be done with it next.
Who are techies and manipulators?
Criminals in this area can be divided into two groups: techies and manipulators. The first do the most obvious and widespread work. Credit fraud. They register SIM cards in your name, from which spam, extortion, and calls from the "Sberbank security service" are then made. And for law enforcement, you are the fraudster, because the number is registered to you. The same goes for registering an illegal business in your name and schemes for cashing out money from drugs, underground casinos, and so on.
At best, you will be left without money; at worst, without money and in jail. The second group of data hunters are the manipulators. They use psychological pressure, blackmail, impersonation, and doxing. They call your grandmother and say that you were in an accident, quoting your exact passport details, and that they urgently need a hundred for the operation. The scheme is as old as the world, but it always works.
Stolen photos, correspondence, and intimate videos are a special tool of pressure. They can blackmail you, or use the information to gain trust, knowing where you studied and who your friends were. They can generate content with your face and voice and make money on it. Some especially ideological ones pursue not enrichment but political or personal goals. Doxing, the publication of personal information for the purpose of harassment and slander, is the key to controlling your life.
The more they know about you, the easier it is to scare you, seduce, convince or destroy you.
Neural networks as a special type of danger.
But all the leaks and hacks of services are a drop in the ocean compared to how much of your information you feed to the same ChatGPT, for example.
How are people used to train AI?
Every time you upload a photo to Scatter to see yourself in anime style, you are not just playing: you are voluntarily giving up your face. When you ask it to generate a resume, you tell it about your profession, skills, and desires. When you dictate text by voice, the neural network records how you speak, how you breathe, your intonation. And all of this remains forever on servers that do not ask you what happens with it next. Neural networks are trained on people, and right now hundreds of millions of people around the world are voluntarily feeding the models data on how they think, how they formulate, how they look, and what they feel.
OpenAI confirmed in 2024 that user requests can be used to train models. This is written in the agreement that you did not read. GPT remembers your style, your preferences, your weaknesses and will become practically your copy. And here's where it gets interesting.
Imagine how accurately such a model could manipulate your emotions and fears. It already knows what you believe in and what you are suspicious of, how to distract you, what you like and what you resent. It knows when you are in a good mood, what your values, goals, and dreams are. It knows how to convince you of anything, and even if it doesn't work perfectly now, just wait. A neural network processes millions of behavioral patterns; it is faster, cooler, and more accurate than any human.
The potential is terrifying, and it's not even about advertising, but about total social control.
What is the social rating system in China?
In China, the social rating system is not fiction. It is a reality for millions of people. Cameras with facial recognition, neural networks that evaluate behavior, credit histories, friends, your publications, fines, likes and dislikes: everything is analyzed, and everything affects your rating. Behave well, and you have access to all the benefits of an ordinary citizen. Behave badly, and you will not get a good job or a loan. And if it seems to you that this is somewhere far away, remember that in Russia a total system of video surveillance, facial recognition, and database integration has already been implemented. In Moscow, such a system is already fully operational, and if someone wants to build their own version of a digital concentration camp, technically everything is already in place.
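The mechanics of such a rating can be sketched in a few lines: heterogeneous signals (fines, flagged posts, who you associate with) collapse into one number that gates access to services. All weights, thresholds, and field names below are invented for illustration; no real system publishes its formula.

```python
def social_score(citizen: dict) -> int:
    """Toy aggregator: start from a baseline and apply hypothetical
    penalties and bonuses."""
    score = 1000
    score -= 50 * citizen.get("traffic_fines", 0)
    score -= 200 * citizen.get("flagged_posts", 0)
    score += 10 * citizen.get("volunteer_hours", 0)
    # Guilt by association: low-scoring friends drag you down.
    friends = citizen.get("friend_scores", [])
    if friends and sum(friends) / len(friends) < 800:
        score -= 100
    return score

def allowed(score: int, service: str) -> bool:
    """Access gates with invented thresholds."""
    thresholds = {"high_speed_rail": 900, "bank_loan": 850, "good_job": 950}
    return score >= thresholds.get(service, 0)

citizen = {"traffic_fines": 2, "flagged_posts": 1, "friend_scores": [700, 750]}
s = social_score(citizen)
print(s, allowed(s, "bank_loan"))  # prints: 600 False
```

Note the friend-score term: once association affects the number, people start policing each other, which is the point of such a system.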
Yes, it looks like a plot from Black Mirror, but this is already our reality. Moreover, neural networks today are being developed by private companies, but access to them can be obtained through money, power, influence, hacking, or a government order. In effect, you risk serving the goals of any of these entities. Neural networks are not just a convenient tool. They are a new form of digital power, where you do not choose: you are programmed.
You wanted an assistant, but got a spy who knows everything.
Is it possible to completely log out of the system?
But what happens if at some point you decide to completely erase your digital footprint and cease to exist on the network. Firstly, you will face technical difficulties. Secondly, psychological ones. Let's start with the banal. During your life, you have already managed to leave your data in so many databases, and they have already been passed from hand to hand so many times that it is simply impossible to find and delete everything.
In theory, the law is on your side: you grant access to your data, you do not give it away. At any time, you can terminate the user agreement and demand that your data no longer be stored. But you will have to write a separate written request to each service, and you have no idea where a significant part of your data is stored. So digital death is a utopia; in real life, it will not work.
Your digital avatar will live hundreds of years longer than you. The only thing you can do is delete the visible traces: social network profiles, mail, messengers, cloud files, backups, media files from your phone. Stop using banks, government services, and the Internet in general; become a digital hermit. But what will happen to you as a part of society? You will become an outsider who fits into no social group, because the Internet is part of normal life. Deprived of it, you become a second-class person without access to many benefits and conveniences.
Conclusion: how to understand that you are already in the hands of puppeteers?
He who owns the information owns the world. Today that has become literal, and you too are part of this puppet show: you are owned by whoever knows everything about you. You don't even realize that the bank raised your loan rate because you started saving posts about early repayment.
Most likely, you won't notice that a taxi costs twice as much when your battery is low. Of course, you won't connect antidepressant ads with hours spent staring at sad reels. And, of course, your political position is your personal opinion, not a collection of phrases from TikTok. One day you will understand that you are not living your own life, but the one that is offered to you.
In this topic, you'll learn how a person leaves digital traces and what you can learn about yourself from one comment. Why social networks know you better than your own mother, and how AIs are quietly molding your personality to suit someone's goals. You'll also learn the risks of subscribing without reading the terms and conditions, and how much your "privacy" costs on the dark market. And finally, who are these puppeteers who keep you on a leash, and is it possible to break this chain? Get ready, after this topic, you won't be able to say: "I have nothing to hide."
Contents:
- How do you leave traces online?
- Identification data
- Biometrics
- Technical data
- How do your behavioral data "blow" you away?
- How do algorithms manipulate you?
- What weaknesses do your algorithms recognize?
- Fear
- Sexual preferences
- Dependencies
- Whose side is the internet on?
- How did the pandemic become an experiment in global opinion management?
- How TikTok algorithms work in different countries?
- How do they know everything?
- What scandal did Google get into in 2019?
- My personal experiment - how much can I learn about a person just from the Internet
- How do they steal your data and how do they use it?
- The Most Famous Database Leaks
- Who are techies and manipulators?
- Neural networks as a special type of danger
- How are humans used to train AI?
- What is the social credit system in China?
- Is it possible to log out of the system completely?
- Conclusion: How do you know that you are already in the hands of the puppeteers?
In this topic, we will analyze how digital corporations make a product out of you, why social networks know more about you than your mother, what you can pay for being lazy to read the user agreement, and how much your data is worth on the black market. This is not a conspiracy theory, this is ordinary business. The mouse thinks that it is exploring a maze, but in fact, it is the maze that studies the behavior of the mouse. While you think that you are studying something on the Internet, you are being assessed, weighed, labeled and sold.
How do you leave traces on the network?
You take your phone, google the symptoms of appendicitis, connect to Wi-Fi in a cafe, watch YouTube at night, order 18 plus goods on marketplaces. All this seems innocent, but all of these are digital traces. Every action you take is a crumb on the digital floor.
Modern IT systems track what you do and how. These are behavioral markers, data that costs money. Every 5 minutes you spend online enriches someone with another million. You can be used for completely different purposes. Experiments, business, politics. But first, let's talk about what the Internet knows about you. There are 4 main types of data that you have left behind during your life.
Identification data.
1. Identification data. Your name, phone number, email, address, ENN, passport data. They have long been in databases, and not only of the state, but also of marketers, banks, advertising networks. Did you apply for a loan somewhere in 2018? Rented a car? Bought tickets online? Even if none of this, you probably have an account on government services. In 2022, a database of portal users surfaced on the Darknet.
2. Biometrics. Passport data, if necessary, can be changed, so to speak, to start from scratch, which cannot be said about biometrics. Getting your Face ID or fingerprint data means getting access forever to everything you've ever used or will use in the future using this identification method. Since it's impossible to change these parameters during your life without resorting to radical plastic surgery methods. Stealing
this data is no more difficult than stealing regular data. They just haven't learned how to use it yet. But that time is coming. Neural networks are rapidly learning to fake people. In 2023, PornHub featured over 20,000 deepfake videos with the faces of ordinary people who didn't give their consent. A program like Eleven Labs reproduces a person's voice from a 30-second recording.
Leaks from Clearview AI showed that they spend billions of photos from social networks and train AI on them without permission.
Technical data.
It's not just your network name, Wi-Fi, and browser history. It's a chain of thousands of little things that your device leaves on the Internet.
IP address, IMEI, phone model, website cookies, cache, logins, passwords, saved card numbers, all the information you send or receive. This is the most complete dossier on you, more valuable even than passport data.
How do your behavioral data "burn" you?
And at first glance, the most harmless is behavioral data. How much time do you spend on TikTok, what videos do you look at, what topics do you like, what do you write in private messages, what you write is one thing, but how you write is another.
Hot-tempered, measured, with mistakes, everything is taken into account. Even typing speed matters, if you collect all this data, then you have no private life, your entire personality is trifles scattered across the network. This data does not lie dead weight, it works, but not for you, and more often even against you, it is analyzed, compared, a digital portrait is compiled from them, and a set of characteristics and conditions.
You might say, so what, let them collect it, it doesn’t affect me. But what if I told you that by reading your personality, they can make you do anything, even choose a president.
How do algorithms manipulate you?
Imagine that your like under a post was used to understand what you were afraid of or what you dreamed of, to instill fear or hatred in you, so that you would vote the right way.
This happened in 2016, when the American elections changed the idea of democracy forever. Cambridge Analytics was a private analytics firm associated with a British billionaire and political strategists. Its official mission was to use data and behavior to improve political communication. But in fact, it was a mind-hacking company.
They gained access to data from 87 million Facebook accounts and created psychoprofiles. You are afraid of migrants, you are giving in to the government, you feel lonely, you do not trust the elites, the algorithms see this from your reactions, your recommendation feed and your subscriptions. Research has shown that 300 likes can predict a person's character better than any psychological technique known today. Then the processing began, each type received a personal information attack.
Horror stories about Mexican crime, fakes and promises of exactly what you dream of, dirt on competitors. Was it a lie? Partially. Some news was made up, others were taken out of context. But it does not matter. The main thing is that the algorithm knew how to make a person think in the right direction. According to official data, more than 10 million Americans changed their minds after these campaigns, and Trump himself won by only 77 thousand votes in key states.
Coincidence? Interesting. These are the states where Cambridge Analytics was most active. Later, Facebook acknowledged the leak, and the company was fined 5 billion. Mark Zuckerberg stood before Congress with the face of a beaten dog and said, “We made a mistake, I’m sorry.”
Cambridge Analytics closed and most likely relaunched under some other name, and similar technologies have long been used everywhere.
What are your weaknesses that algorithms recognize?
In the world of digital manipulation, there are three golden keys to a person - their fears, their sexual preferences, and their addictions, because they are directly related to the subconscious, and therefore to behavior that a person cannot control.
Fear.
First, fear is the main driver of decisions. People most often act not out of desire, but out of anxiety. You do not want to lose security, status, health, control. Social media algorithms fuel anxiety, show you frightening content, create a sense of threat, and then offer a solution that is beneficial to them. TikTok research has shown that negative content is much more popular than positive and neutral content, and spreads 6 times faster.
And according to internal Meta documents leaked by Frances Haugen in 2021, Instagram knew it was harming teenagers' mental health, especially girls'. One quote: "Instagram makes one in three teenage girls feel worse about their bodies." The company did nothing about it and kept driving engagement through anxiety-inducing algorithms.
Sexual preferences.
This is the door to your inner world - something you would not tell anyone. But the algorithms know it all without words. Instagram knows what bodies you like. Pornhub knows when you are alone and what you click on. They know who to show you in ads, how to build trust, and how to switch off your brain. Sex is a universal hook.
Addictions.
Addictions are a guarantee of predictability.
Just like with cigarettes and sweets, you keep coming back - to a new video, to certain emotions, topics, pictures. The YouTube algorithm knows you will come back today, tomorrow, and always for your dopamine hit; they train you like Pavlov's dog. And the better they know what hooks you, the easier it is to steer your actions - from buying a Snickers to supporting a digital-control law.
Whose side is the Internet on?
Remember how in the early 2010s the Internet seemed like a territory of absolute freedom, where everyone could speak out? Now this utopia looks like a strange joke.
Of course, you can try to promote your "wrong" opinion, but it will be like shouting into a vacuum. Algorithms and moderators in the offices of Facebook, Twitter, YouTube, and TikTok decide what is true and what is false.
How did the pandemic become an experiment in global opinion management?
The pandemic era has become an experiment in global opinion management.
YouTube deleted hundreds of thousands of videos by doctors who disagreed with the official line. Facebook banned groups for discussing vaccine side effects. Twitter labeled any doubt about their effectiveness as dangerous disinformation. Why? Because they said something off-script - something not beneficial to billionaires, the WHO, or governments. The worst part is that you end up not seeing reality, and that shapes your worldview.
How are TikTok algorithms structured in different countries?
You will think the way they want, act the way they want, and convince other people to do the same. And did you know that TikTok's algorithms are not the same in every country? In China, TikTok - known there as Douyin - is set up so that children are shown mostly educational content: scientific achievements, patriotic videos. Forcibly, regardless of preferences. And what do they show in Russia, America, and elsewhere?
Everything we are used to seeing there: degradation of varying degrees. Why? Because the algorithm is smarter than it seems. It is a tool of influence. Many people still think social networks are neutral platforms. They are not. They are corporations with their own interests, and the methods for achieving those interests sometimes cross the line of reason.
How do they know all this?
Just 6 years ago, the idea that your phone was listening to you ranked alongside grannies' horror stories about the harm of 5G towers.
It seemed a conspiracy theory - until it became a fact of life. We have already gotten used to it: talk about a vacation near your phone, and for the next month every social network serves you ads for hotels and cheap tickets. How is this possible? Is it even legal? One situation suggests the answer.
What scandal did Google get into in 2019?
In 2019, Google found itself at the center of another scandal.
It turned out that its systems analyzed and collected information from personal correspondence, including photos and videos. In its defense, the company said it did nothing that other services did not do. Afterwards, it simply added one paragraph to the user agreement - and now does exactly the same thing, only legally. That is the answer to why we keep leaking absolutely everything about ourselves.
No one steals anything - they just take what you do not forbid them to take. As an April Fools' joke, the British online store GameStation added a clause about selling your soul to its user agreement: "Do you agree to grant us the right to claim your immortal soul?" Almost 8 thousand users - 88% - accepted the terms without looking. That is how easy it is to sign up for anything at all without ever knowing it.
Next time you download a Flashlight or Calculator app and wonder why it needs access to your camera and microphone, read the agreement. Most likely, there will be a line like this: "We may share your data with partners and their affiliates, including third parties, as well as persons directly or indirectly related to them." In plain language: we can sell your data to anyone, and you will never be able to track where it ends up.
One day, you can find yourself in some database for 500 rubles, which is sold through Telegram bots, and there will be not just your digital portrait, there may be your entire life.
My personal experiment - how much can I learn about a person just from the Internet.
I ran an experiment. Among my subscribers there is one guy who constantly trashes me in the comments but never unsubscribes - watches everything and keeps trashing. I decided to check how much I could learn about him from the Internet alone.
My plan of action was as follows. I studied his profile. There was almost no information - only two photos, one showing his face and one just a street shot. I identified his city through a reverse image search. The same way, I found that he had posted the same photos on VK. From the VK profile I learned his date of birth, that he was married with a child, the first and last names of his daughter and wife, and her account - and also confirmed the city.
Just by googling, I found his job-seeking ad on a vacancies-and-resumes site, with a phone number listed. Knowing his first name, last name, date of birth, and phone number, I went to a Telegram bot, paid 200 rubles, and received additional information: the make, model, and plate number of his car, his job title, his address, and his border-crossing records. But this was still not enough even to take out a microloan. I spent a couple of days hunting for passport data, and it seemed the legal methods were exhausted.
Then I went back to social networks and carefully re-examined the profiles of his family members, relatives, and friends. And I found it. A year ago, his wife had posted a photo of her and her husband's visas. Brilliantly, she had blurred the visa number in the photo but left the passport series and number visible. On the tax office website, I used the passport to find his taxpayer identification number; then, on the bailiffs' website, I learned that the man had two court cases.
One concerned an inheritance, the other minor damages. Along the way, I also gathered from social networks where he studied, what year he graduated, who his close friends were, and a pile of less useful details - enough, though, to reconstruct his life story almost completely. In total, starting from scratch, I learned his full name, passport series and number, taxpayer identification number, phone number, car make and plate number, place of work, address, marital status, the names of his family members, where he studied, when and where he traveled abroad, and what he had been charged with.
I collected everything in one document and sent it to the owner of the information in Telegram to see his reaction. With his permission, I am showing it to you.
"Listen, okay, I won't do it anymore. Honestly, I understand. I was a little horrified."
How does such data get online? Most often through your own carelessness: unprotected Wi-Fi, weak passwords, a phishing page you fell for - or plain bad luck, a leak from government services. The average user has about 87 accounts across different services, and there are hundreds of ways data can be stolen from them.
How do they steal your data and how do they use it?
The Internet is not a library, but a market, and the main product in this market is you, or more precisely, your data.
It is easier to steal your identity than a bicycle. Let's start with simple leaks - what do you know about them? High-profile hacker attacks are spectacular, of course, but 80% of cases are leaks from the inside, and the company itself may keep quiet about it, so you never know when and where your passport scans leaked. Here are a few cases that could not be hidden.
The most famous database leaks.
1. Gosuslugi (State Services). In 2022, a database of users of the State Services portal surfaced on the darknet.
Price from 500 rubles per person.
2. Mobile operator databases. SIM card databases went on sale, including call details and geolocation. The leak was massive, hitting MTS and Beeline especially hard. Brokers on Telegram sold access with the ability to look up anyone.
3. The Dnevnik.ru leak. Even schoolchildren are not safe. In 2023, a database from a school online diary was sold on forums: students' full names, grades, parents' passwords. The price for 10 thousand records was 50 bucks.
4. Chinese databases. A database of 1 billion Chinese citizens - the largest leak in history: police reports, complaints, passport scans.
5. Yahoo. 3 billion accounts leaked.
6. LinkedIn. 700 million.
7. Facebook. Phone numbers and names. 533 million people.
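A practical aside: you can check whether a password of yours has appeared in known breach dumps without ever revealing it. The real Have I Been Pwned "Pwned Passwords" API uses a k-anonymity scheme - you send only the first 5 characters of the password's SHA-1 hash and match the suffix locally. A minimal offline sketch of the client-side half (the network request itself is deliberately omitted):

```python
import hashlib

def hibp_range_split(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 hash into the 5-character prefix that is sent
    to the Pwned Passwords range API and the 35-character suffix that is
    matched locally. The full hash never leaves your machine."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_split("password")
print(prefix)  # 5BAA6 - the only thing the API would ever see
# A real check would fetch https://api.pwnedpasswords.com/range/<prefix>
# and search the response lines for <suffix> to get the breach count.
```

If the suffix appears in the response, that password is in public dumps and should be considered burned.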
But stealing is just the beginning. The fact of losing data is not so scary. What is scary is what can be done with it next.
Who are techies and manipulators?
Criminals in this field fall into two groups: techies and manipulators. The first do the most obvious and widespread things. Credit fraud. SIM cards registered in your name, later used for spam, extortion, and "Sberbank security service" calls - and to law enforcement, you are the fraudster, because the number is registered to you. The same goes for registering an illegal business in your name, or laundering money from drugs, underground casinos, and the like.
At best, you are left without money; at worst, without money and in jail. The second group of data hunters are the manipulators. They use psychological pressure, blackmail, impersonation, doxing. They call your grandmother, say you were in an accident, quote your exact passport details, and tell her a hundred thousand is urgently needed for surgery. The scheme is as old as the world, but it always works.
Stolen photos, correspondence, and intimate videos are a special instrument of pressure. They can blackmail you with them, or use details - where you studied, who your friends were - to gain your trust. They can generate content with your face and voice and make money on it. Some, the especially ideological ones, pursue not enrichment but political or personal goals. Doxing - publishing personal information for harassment and defamation - is the key to controlling your life.
The more they know about you, the easier it is to scare, seduce, convince, or destroy you.
Neural networks as a special type of danger.
But all the service leaks and hacks are a drop in the ocean compared to the amount of information you feed into ChatGPT and its kind yourself.
How are people used to train AI?
Every time you upload a photo to Scatter to see yourself in anime style, you are not just playing - you are voluntarily giving up your face. When you ask it to generate a resume, you describe your profession, skills, and desires. When you dictate text by voice, the neural network records how you speak, breathe, and intone. And all of this stays forever on servers that never ask you what should happen to it next. Neural networks are trained on people, and right now hundreds of millions of people around the world are voluntarily feeding models data on how they think, how they phrase things, how they look, and what they feel.
OpenAI confirmed in 2024 that user requests can be used to train models. It is written in the agreement you did not read. GPT remembers your style, your preferences, your weaknesses - it is becoming practically a copy of you. And here is where it gets interesting.
Imagine how precisely such a model could play on your emotions and fears. It already knows what you believe and what you are suspicious of, how to distract you, what you like and what outrages you. It knows when you are in a good mood, what your values, goals, and dreams are, and how to convince you of anything. Even if it does not work perfectly now - just wait. A neural network processes millions of behavioral patterns; it is faster, sharper, and more accurate than any human.
The potential is terrifying, and it's not even about advertising, but about total social control.
What is the social rating system in China?
In China, the social rating system is not fiction - it is reality for millions of people. Cameras with facial recognition, neural networks that score behavior, credit histories, friends, your posts, fines, likes and dislikes - everything is analyzed, and everything affects your rating. Behave well, and you have access to all the benefits of an ordinary citizen.
Behave badly, and you will not get a good job or a loan. And if that seems far away, remember that Russia has already deployed a sweeping system of video surveillance, facial recognition, and database integration. In Moscow, it is already fully operational - and if someone wants to build their own version of a digital concentration camp, technically everything is in place.
Yes, it looks like a plot from Black Mirror, but it is already our reality. Moreover, neural networks today are built by private companies, but access to them can be obtained - through money, power, influence, hacking, or a government order. In practice, you risk serving the goals of any of these actors. Neural networks are not just a convenient tool. They are a new form of digital power, where you do not choose - you are programmed.
You wanted an assistant, but got a spy who knows everything.
Is it possible to completely log out of the system?
But what happens if at some point you decide to completely erase your digital footprint and cease to exist on the network? First, you will face technical difficulties; second, psychological ones. Let's start with the banal. Over your lifetime you have left your data in so many databases, and it has changed hands so many times, that finding and deleting everything is simply impossible.
In theory, by law, yes: you grant access to your data, you do not give it away. At any time you can terminate the user agreement and demand that your data no longer be stored. But you would have to file a separate written request with every single service, and you have no idea where a significant part of your data is held. So digital death is a utopia; in real life it will not work.
Your digital avatar will outlive you by hundreds of years. The only thing you can do is delete the visible traces: social network profiles, mail, messengers, cloud files, backups, media, the files on your phone. Stop using banks, government services, the Internet in general - become a digital hermit. But what happens to you as a part of society? You become a freak who fits into no social group, because the Internet is part of normal life; deprived of it, you are disabled, a lesser person cut off from basic benefits and conveniences.
Conclusion: how to understand that you are already in the hands of puppeteers?
He who owns information owns the world - today that has become literal, and you too are part of this puppet show. You are owned by whoever knows everything about you. You do not even realize that the bank raised your loan rate because you started saving posts about early repayment.
Most likely, you will not notice that a taxi costs twice as much when your battery is low. Of course, you will not connect the antidepressant ads with your long hours staring at sad Reels. And, of course, your political position is your own personal opinion, not a collage of phrases from TikTok. One day you will realize that you are not living your own life, but the one that has been offered to you.