Teacher
Professional
Hello, cyberstalkers! Hello, carders. Even if you guard your data carefully, that alone will not buy you the privacy you want. And the further things go, the more opportunities companies and governments have to collect and use all kinds of information, and identifying a person from it is no problem at all. Using a few telling examples, we will look at how this happens and what it can potentially lead to.
Let's go.
That very panopticon
We will start not with modernity but with the eighteenth century, which gave us the word "panopticon". Originally it was the name of a rather strange invention: a prison in which the warden could watch every prisoner at all times, while the prisoners could see neither each other nor the warden.
Sounds like a thought experiment or a philosophical concept, right? That is partly true, but the author of the concept, the British philosopher Jeremy Bentham, was quite determined to build a panopticon in reality.
Bentham was obsessed with the idea and spent years begging the British and other governments for money to build a prison of this new type. He believed that continuous, unobtrusive surveillance would help reform prisoners; besides, such prisons would be much cheaper to run. In the end he was even ready to become the chief warden himself, but he never managed to raise the money for construction.
Since then, several prisons and other institutions inspired by Bentham's drawings have been built in different countries. But what interests us in this story is something else: the panopticon almost immediately became a synonym for total surveillance and control over society, something that now seems to be taking shape right before our eyes.
33 bits of entropy
Researcher Arvind Narayanan has proposed a useful way to measure the degree of anonymity. He calls it "33 bits of entropy": that is how much information you need to know about a person to identify them uniquely among the entire population of Earth. If each bit is a binary attribute, 33 bits give 2^33 ≈ 8.6 billion combinations, more than enough to single out one match among 6.6 billion people.
However, keep in mind that attributes are usually not binary, and sometimes just a few parameters are enough. Take, for example, a database that stores your zip code, gender, age, and car model. A zip code narrows the sample to about 20 thousand people on average, and that is a figure for a very densely populated city: in Moscow there are about 10 thousand people per post office, and across Russia the average is 3.5 thousand.
Gender cuts the sample in half. Age narrows it to a few hundred people, if not a few dozen. And the car model narrows it to a couple of people, often just one. A rarer car or a smaller town can make half of these parameters unnecessary.
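This arithmetic can be sketched in a few lines of Python. The per-attribute selectivities below are illustrative assumptions (roughly matching the figures in the text), not measured values:

```python
import math

WORLD_POPULATION = 6_600_000_000  # the figure used in the article

# Rough selectivity of each attribute: what fraction of people share
# a typical value. Illustrative assumptions, not measurements.
attributes = {
    "zip code":  20_000 / WORLD_POPULATION,  # ~20k people per code
    "gender":    0.5,
    "age":       1 / 80,                     # one year out of ~80
    "car model": 1 / 300,                    # one of ~300 common models
}

def entropy_bits(fraction):
    """How many bits of identifying information an attribute carries."""
    return -math.log2(fraction)

remaining = WORLD_POPULATION
total_bits = 0.0
for name, fraction in attributes.items():
    bits = entropy_bits(fraction)
    total_bits += bits
    remaining *= fraction
    print(f"{name:>9}: {bits:5.1f} bits, anonymity set ≈ {remaining:,.0f}")

print(f"total: {total_bits:.1f} of the ~33 bits needed")
```

Four mundane attributes already add up to roughly the 33 bits Narayanan talks about, and the anonymity set shrinks below a single person.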
Scripts, cookies, and fun tricks
Instead of a zip code and car model, you can use the browser version, operating system, screen resolution, and other parameters that the browser reveals to every site we visit; advertising networks diligently collect all of it and easily track the paths, habits, and preferences of individual users.
Or here is another interesting example, this time from the world of mobile applications. Every iOS app that is granted access to photos (say, to apply a pretty filter) can scan the entire library in seconds and suck out the photo metadata. If there are enough photos, the geolocation data and timestamps make it easy to reconstruct your movements over the past five years. A proof of concept can be found in Felix Krause's repository.
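As a rough sketch of what such metadata mining involves: EXIF stores GPS coordinates as degree/minute/second values plus a hemisphere reference, so reconstructing a movement trail is just a matter of converting them to decimal degrees and sorting by timestamp. The photo entries below are made-up stand-ins for what a real EXIF reader (e.g. Pillow or exifread) would return:

```python
from datetime import datetime

def dms_to_decimal(degrees, minutes, seconds, ref):
    """EXIF stores GPS as degree/minute/second values plus N/S/E/W."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# Hypothetical metadata in EXIF style: shooting time plus
# GPSLatitude/GPSLongitude triples with hemisphere references.
photos = [
    ("2017:06:01 09:15:00", (48, 51, 29.6), "N", (2, 17, 40.2), "E"),
    ("2017:06:03 18:40:00", (51, 30, 26.0), "N", (0, 7, 39.0), "W"),
]

# Sorting by timestamp turns isolated snapshots into a movement trail.
trail = sorted(
    (
        datetime.strptime(ts, "%Y:%m:%d %H:%M:%S"),
        dms_to_decimal(*lat, lat_ref),
        dms_to_decimal(*lon, lon_ref),
    )
    for ts, lat, lat_ref, lon, lon_ref in photos
)

for when, lat, lon in trail:
    print(f"{when:%Y-%m-%d %H:%M}  {lat:+.4f}, {lon:+.4f}")
```

With a few thousand photos, the same loop yields a years-long location history, which is exactly what Krause's proof of concept demonstrates.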
There are also better-known but no less unpleasant things. Many traffic counters, for example, let site owners record a user's session: every mouse movement and every button click. On the one hand, there is nothing illegal here; on the other, watch such a recording with your own eyes and you understand that something is off and such a mechanism should not exist. And that is not counting the fact that in certain cases passwords or payment data can leak through it.
There are plenty of examples like these; if the topic interests you, you can easily recall something in the same spirit. What is much harder is to understand what happens to the collected data next. And its further life is rich and interesting.
Habit Dealers
Much of the collected data may seem harmless and impersonal at first glance, but linking it to a person is not as hard as it seems, and its harmlessness depends on circumstances. How safe do you think it is to share pedometer data? Among other things, it makes it easy to tell when you are not at home and on which days you come back late.
Robbers, admittedly, have not yet embraced such high technology, but advertisers will take any information about your habits. Especially the SMM hellhounds. How often do you go out? On weekdays or weekends? Morning or evening? To what places? And so on and so forth.
In the United States alone, Newsweek estimates, several thousand companies collect, process, and sell such databases (industry giants Acxiom and Experian operate with turnovers in the billions of dollars). Their main product is ratings of various kinds that business owners can buy ready-made and use in decision-making.
There is no shortage of people willing to sell data either. Any businessman who learns he can make a little money out of nothing will at least take an interest. And don't assume the main evil here is soulless corporations. Startups are almost worse in this respect: today they employ excellent, honest people, and tomorrow there may be only a couple of cynical managers auctioning off the last of the assets.
The Newsweek author notes an important point: broker-collected data often contains errors, and a rating may come out wrong. So if someone is suddenly denied a loan or turned down for a job, perhaps they should blame not themselves but the crazy surveillance system that has spontaneously emerged over the past fifteen years.
Your Shadow profile
In most countries, the center of public life online is Facebook, so conversations about online privacy often revolve around it.
At first glance, it may seem that social networks like Facebook take good care of personal data and offer plenty of privacy settings. Tech-savvy people therefore tend to dismiss the fears and complaints of their less savvy friends: if someone accidentally shared something with the world they didn't want to share, it's their own fault.
In reality, Facebook does care about the safety of personal data, but more in the spirit of the well-known Soviet-era pun "We want peace, preferably all of it" (in Russian, mir means both "peace" and "world"). Like many social networks and messengers, Facebook tries to capture and store as much personal data as it can. Sometimes the consequences break out into the open.
A recent investigation by Gizmodo reporter Kashmir Hill is revealing. She looked for Facebook users who saw someone they shouldn't have in the "People You May Know" section, or, conversely, whose profile inexplicably surfaced for a stranger.
Among the examples Hill cites:
- a social worker whom a client addressed by her nickname on the second visit, although they had never exchanged contact details;
- a woman whose father abandoned the family when she was six: Facebook suggested she add as a friend the mistress he had left them for forty years ago;
- a lawyer who said he deleted his account in horror after finding, among the recommendations, a participant in the defense in a recent case, a person he had interacted with only through a work email not linked to Facebook.
Deleting your Facebook account is a drastic step, and in this case a step of only limited use. Have no doubt that the social network will keep storing all the data and using it to build its hidden social graph.
The reason behind all these cases is the same: Facebook's algorithms successfully link together the scraps of information people leave behind. Say one of your friends has only your phone number, and another has only your email address. Both let Facebook scan their address books. Now the phone number and the email address are associated with each other, and with every phone number, address, messenger ID, and nickname you have ever used and given to anyone.
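A minimal sketch of how such linking can work, under the simplifying assumption that identifiers stored in the same contact entry belong to one person; a union-find structure then merges fragments arriving from different uploads. All names and numbers here are invented:

```python
# Union-find over identifiers: any two identifiers that appear in the
# same uploaded contact entry get merged into one identity.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Friend A's entry for you holds only your phone number; friend B's
# entry holds only your email. Each also stores the nickname they
# know you by.
uploads = [
    ["+1-555-0100", "nick: johnd"],        # friend A's address book
    ["john@example.com", "nick: johnd"],   # friend B's address book
]

for entry in uploads:
    first, *rest = entry
    for identifier in rest:
        union(first, identifier)

# Phone and email never appeared together, yet they now resolve to
# the same identity through the shared nickname.
same_person = find("+1-555-0100") == find("john@example.com")
print(same_person)  # True
```

Real systems use far richer signals (names, photos, device fingerprints), but the transitive-merging logic is the same: two uploads that each reveal only a harmless fragment jointly stitch the fragments together.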
Similar behavior has been observed in Telegram. Therein lies the paradox: a messenger that on the one hand provides convenient encryption tools and on the other does not forget to capture everything it can reach.
What's even more fun, Facebook already has so much data that even a person who has never had an account can still have a profile compiled from the information other people continuously leave. And "can" in such cases means it almost certainly happens: if the network's algorithms see a person-shaped hole in the information, they will keep it in mind and keep adding new details.
Facebook employees strongly dislike the phrase "shadow profile". Of course, it's pleasant to think they are merely collecting data that helps people find old friends, restore lost business contacts, and so on. It's a little less pleasant to remember that all of it is done to sell ads more effectively. And nobody wants to keep imagining cases of blackmail, revenge, or fraud.
The not-too-distant future
Until ten or fifteen years ago, everything discussed here either didn't exist or didn't bother most people. Public reaction is now slowly changing, but most likely not fast enough, and the next ten or fifteen years promise to be extremely unpleasant for privacy (and extremely unprivate).
If today the easiest thing to surveil is the Internet, then as electronics get smaller and cheaper, the same problems will gradually reach the physical world around us. In fact, they are already creeping in, but this is only the beginning.
In the article "How low (power) can you go?", writer Charles Stross starts from Koomey's law (a cousin of Moore's law, stating that the energy efficiency of computing doubles roughly every 18 months) and estimates the minimum power consumption of computers in 2032.
By Stross's rough estimate, in fifteen years the equivalent of today's low-power mobile systems-on-chip, complete with a set of sensors, will be able to run off a solar cell two or three millimeters on a side, or off the energy of a radio signal. And that is without counting possible advances in batteries or other ways of delivering energy.
Stross goes on: "Let's assume that our hypothetical low-power computer will cost 10 cents apiece in mass production. Then covering London with processors, one per square meter, will cost 150 million euros by 2040, or 20 euros per person." That is roughly what the London authorities spent in 2007 on scraping chewing gum off the streets, so such a project would hardly be beyond the city's means.
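Stross's numbers are easy to check on the back of an envelope. The constants below (a 2017 baseline, Greater London's area and population) are assumptions for illustration, not his exact inputs:

```python
# Koomey's law: energy efficiency of computing doubles every ~18 months.
DOUBLING_MONTHS = 18
years = 2032 - 2017                     # the fifteen-year horizon in the essay
doublings = years * 12 / DOUBLING_MONTHS
efficiency_gain = 2 ** doublings
print(f"{doublings:.0f} doublings -> {efficiency_gain:,.0f}x less energy per op")

# Covering London with one 10-cent sensor per square meter.
LONDON_AREA_M2 = 1_572 * 1_000_000      # Greater London, ~1,572 km^2
COST_PER_SENSOR = 0.10
LONDON_POPULATION = 8_800_000           # rough 2017 figure
total_cost = LONDON_AREA_M2 * COST_PER_SENSOR
print(f"total ~ {total_cost / 1e6:.0f} million, "
      f"or {total_cost / LONDON_POPULATION:.0f} per Londoner")
```

Ten doublings give a thousandfold efficiency gain, and the blanket-coverage cost lands close to the 150 million (about 20 per resident) Stross quotes.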
Stross goes on to discuss the benefits of computerizing urban surfaces, from ultra-accurate weather forecasts to epidemic prevention. I recommend reading it in full, but within this article we are of course interested in something else: the level of total observation that this near-"smart dust" out of science fiction could provide.
"Wearing a hoodie isn't going to help," Stross notes sarcastically. Indeed, even if ubiquitous sensors carry no cameras, they can still read every RFID tag and identify every mobile device with a radio module. And anyway, why wouldn't there be cameras? Tiny lenses that can be manufactured together with the sensor are already being developed in laboratories.
Crowd Memory
Stross calls the potential result of evenly spraying computers over our surroundings a "panopticon singularity", by analogy with the technological singularity and its runaway, uncontrollable spread of technology. It is another technogenic horror story, one of ubiquitous surveillance and, as a consequence, unfreedom.
One can imagine the extreme consequences if mass surveillance technology falls into the hands of a dictatorial regime or religious fundamentalists (such as the Russian Federation or ISIS). But even if you imagine such a system controlled by an ordinary government, not especially corrupt and not too fixated on traditional values, it is still somehow unsettling.
Another potential source of total surveillance is people themselves. Here is a brief history of this technology:
- version 1.0 — old ladies by the entrance who know everything about everyone;
- version 2.0 — passers-by who whip their phones out of their pockets, snap photos, and post them on social networks;
- version 3.0 — always-on cameras on everyone's head.
The reasons to put a camera on your head vary: streaming on Periscope or Twitch, augmented reality applications, or simply recording everything that happens so you have a digital copy of your memory to consult at any time. The last use case is called "lifelogging".
One of the main theorists of lifelogging is Gordon Bell, a senior researcher at Microsoft Research. In his book Your Life, Uploaded, Bell explores various aspects of a society in which everyone has documentary evidence of everything that has ever happened. Bell believes this is a good thing: there will be no crime, no wrongly punished innocents, and people will have the machines' ultra-reliable memory at their service.
What about privacy? Will it be possible, in that future world, to do anything in secret? Bell believes the system should be built so that you can exclude yourself from public memory at any moment: you press the magic button, and the right to view recordings of you remains, say, only with the police. Convenient, isn't it?
And this is where you see clearly where the road Facebook has taken with its shadow profiles leads. If everything continues as it is and people keep happily uploading their data, the magic button for opting out of shared memory simply won't change anything. You exclude yourself, and the intelligent algorithm immediately reconnects everything from indirect signs.
What to do
... to avoid a dark future that would make even George Orwell shudder?
We all assume that, probably, someone somewhere will somehow take care of this. Activists should say "no", governments should pass laws, the executive branch should enforce them, and so on. Alas, history knows plenty of cases where this scheme failed badly.
There are indeed government initiatives. See, for example, the European GDPR, the General Data Protection Regulation. It governs the collection and storage of personal information, and under it Facebook will no longer get away with a fine of 122 million euros, sparing considering its enormous turnover: it would have to pay up to 4% of revenue, that is, upwards of a billion.
The first stage of the GDPR came into force on May 25, 2018. Russia already has its own, not very successful, experience with the law "On Personal Data". Tellingly, few people mourn its modest success ("better to be watched by Zuckerberg, Page, and Brin than by Comrade Major").
And in general, there are serious suspicions that governments want not so much to stop surveillance as to plug into it, and ideally to win back a monopoly on it. It would be far better if corporations started to control and regulate themselves. Unfortunately, as the example of Facebook shows, some of them are working in exactly the opposite direction.
Cyberstalkers, can we do anything on our own, other than frantically trying to hide, encrypt, obfuscate, and switch everything off? I suggest starting with the main thing: try never to throw up your hands and say, "Eh, they're watching everyone anyway!" Not everyone, not everything, and not with the same consequences. That matters.