How Facebook (and others) manipulate your privacy choices

Tomcat



In 2010, the Electronic Frontier Foundation was fed up with Facebook's intrusive interface: the company kept pushing people to give up more and more of their privacy. The question was what to call this compulsion. Zuckermining? Facebaiting? Zuckerpunch? The name that eventually stuck was Privacy Zuckering: "being tricked into publicly sharing more information about yourself than you originally intended."

In the ten years since, Facebook has gone through enough scandals to see that people are worried about this kind of manipulation. Last year it even paid a $5 billion fine for "making false claims about consumers' ability to control the privacy of their personal data." Yet researchers have found that Privacy Zuckering and other shady tactics are alive and well across the internet, especially on social media, where privacy settings are more confusing than anywhere else.

Back in 2010, Facebook used a trick when it let users "opt out" of partner sites collecting their public social media information. Anyone who refused saw a pop-up asking: "Are you sure? Allowing instant personalization will give you more options when you browse the web." Until recently, Facebook also warned people against abandoning facial recognition: "If you turn off facial recognition, we won't be able to use this technology when a stranger uses your photo to impersonate you." The button that enables the setting is bright blue, while the off button is a less eye-catching gray.

Researchers call these design and language choices "dark patterns": tricks that try to manipulate your decisions. Instagram pesters you to "please turn on notifications" and offers no way to decline? That's a dark pattern. LinkedIn shows you part of a message in an email but forces you to visit the platform to read the rest? Also a dark pattern. Facebook redirects you to "log out" when you try to deactivate your account? Dark pattern again.

Dark patterns are all over the internet, nudging people to subscribe to newsletters and services or to buy goods. Colin Gray, a human-computer interaction researcher at Purdue University, has been studying dark patterns since 2015. He and his team identified five main types:

- nagging;
- obstruction;
- sneaking;
- interface interference;
- forced action.

These games are not limited to social media. They have spread across the internet, especially since the General Data Protection Regulation (GDPR) took effect in 2018, requiring sites to ask people for consent before collecting certain types of data. But some banners simply ask you to accept the privacy policy, with no option to say no. "Some studies have shown that over 70% of consent banners in the EU have some kind of dark pattern embedded in them," says Gray.

Last year, US Senators Mark Warner and Deb Fischer introduced legislation that would ban these "manipulative user interfaces." The problem is that a dark pattern turns out to be very hard to define. "Any design has a certain degree of persuasion," says Victor Yocco, author of Design for the Mind: Seven Psychological Principles of Persuasive Design.

By definition, design encourages using a product in a particular way, which is not bad in itself. It becomes bad when the design is built to deceive.

Gray, too, found it difficult to distinguish dark patterns from plain bad design, so he created a framework for defining the former. A dark pattern deprives users of choice and pushes them toward a decision that benefits the company rather than themselves. Designers use strategies such as misrepresentation, bargaining, and duplicity (for example, an ad blocker that itself contains ads).

Gray cites the example of Trivia Crack, a smartphone app that nags its users to play another game every two to three hours. Social media platforms have used such spam notifications for years to trigger the fear of missing out that keeps you hooked. "We know that if we give people things like swipes or status updates, then people are more likely to come back," says Yocco. "It can also lead to compulsive behavior."

The darkest scenarios play out when people try to leave these platforms altogether. Try deactivating your Instagram account and you will find it surprisingly hard. For a start, you can't do it from the app at all. On the web version of the site, the setting is hidden inside "Edit Profile" and comes with a series of questions: "Why are you deactivating? Too distracting? Try turning off notifications here. Just need a break? Log out instead of leaving altogether."

"It puts obstacles in your path, making it harder to follow through on your decision," says Nathalie Nahai, author of Webs of Influence: The Psychology of Online Persuasion. Years ago, after deleting her Facebook account, she encountered a similar set of manipulative tactics: the social network showed her pictures of some of her close friends. "They use language that I think is coercive," says Nahai. "It makes leaving psychologically painful."

Worse still, Gray says, research shows that most people don't even realize they are being manipulated.

But according to one study, when people were warned in advance about manipulation, twice as many were able to recognize dark patterns. So there is at least some hope that greater awareness can restore a measure of user control.