Cognitive biases



Systematic errors in human thinking, a kind of logical trap. In certain situations, we tend to act in irrational patterns, even when it seems to us that we are proceeding from common sense.

Illusion of control

People tend to overestimate their influence on events whose successful outcome they care about. The phenomenon was discovered in 1975 by the American psychologist Ellen Langer in experiments with lottery tickets. Participants were divided into two groups: people in the first group chose their own lottery tickets, while members of the second group were handed tickets without any choice. Two days before the drawing, the experimenters offered participants in both groups the chance to exchange their ticket for a ticket in a new lottery with better odds of winning.

The offer was clearly advantageous, yet the participants who had chosen their tickets themselves were in no hurry to part with them - as if their personal choice of ticket could somehow affect the probability of winning.

Zero risk preference

Imagine that you have a choice: reduce the small risk to zero, or significantly reduce the high risk. For example, to bring plane crashes to zero or drastically reduce the number of car accidents. Which would you choose?

Based on the statistics, it would be more correct to choose the second option: the death rate from plane crashes is much lower than the death rate from car accidents - so in the end, such a choice will save many more human lives. And yet research shows that most people choose the first option: zero risk in any area looks more reassuring, even if your chances of becoming a victim of a plane crash are negligible.

Selective perception

Let's say you don't trust GMOs. If the topic worries you, you probably read news and articles about genetically modified organisms - and as you read, you become more and more convinced that you were right: the danger is real.

But here's the catch - chances are that you are paying much more attention to news stories that support your point of view than arguments in favor of GMOs. This means that you lose objectivity. This tendency of people to pay attention to information that is consistent with their expectations and ignore everything else is called selective perception.

Gambler's fallacy

The gambler's fallacy most often lies in wait for, unsurprisingly, gamblers. Many of them try to find a relationship between the probability of a desired outcome of a random event and the outcomes that preceded it.

The simplest example is a coin toss: if the coin lands heads nine times in a row, most people will bet on tails next time, as if a long run of heads made tails more likely to come up. But it does not: each toss is independent, and the odds remain the same - 50/50.
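The independence of each toss is easy to check empirically. Here is a minimal sketch (plain Python, simulated fair-coin flips) that counts what actually happens on the flip immediately following nine heads in a row:

```python
import random

random.seed(0)

streak = 0       # current run of consecutive heads
next_heads = 0   # heads observed on the flip right after a 9-heads run
windows = 0      # number of flips that followed a 9-heads run

for _ in range(2_000_000):
    heads = random.random() < 0.5
    if streak >= 9:          # the previous nine flips were all heads
        windows += 1
        next_heads += heads  # bool counts as 0 or 1
    streak = streak + 1 if heads else 0

freq = next_heads / windows
print(f"{windows} nine-heads runs; P(heads on the next flip) ~ {freq:.3f}")
```

Across millions of flips, the frequency of heads after a nine-heads streak stays near 0.5 - the streak carries no information about the next toss.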

Survivorship bias

This logical trap was discovered during the Second World War, but it is easy to fall into in peacetime too. During the war, the US military leadership decided to reduce losses among its bombers and issued an order: based on combat results, determine which parts of the aircraft needed stronger armor. The returning aircraft were studied, many holes were found in the wings and tail, and it was decided to reinforce those parts.

At first glance this looked perfectly logical - but, fortunately, the statistician Abraham Wald came to the military's aid and explained that they had almost made a fatal mistake. The holes in the returning planes carried information about their strong points, not their weak ones: airplanes hit in other places - the engines, for example - simply never made it back.

The principle of the "wounded survivors" is worth keeping in mind today, whenever we are about to jump to conclusions from asymmetric information about two groups.

The illusion of transparency

Imagine you are in a situation where you absolutely have to lie. It is surprisingly hard to do: you feel as if everyone sees right through you, and any involuntary movement will betray your insincerity. This is the "illusion of transparency" - the tendency to overestimate other people's ability to understand our true motives and experiences.

In 1998, psychologists ran an experiment with students at Cornell University. Students took turns reading questions from cards and answering them, telling the truth or lying according to the instructions on the card. The audience was asked to determine when the speakers were lying, and the speakers were asked to rate their chances of fooling the others. Half of the liars assumed they would be found out - but in fact the listeners exposed only a quarter of the lies. In other words, the liars greatly overestimated the perceptiveness of their audience.

Why does this happen? Most likely because we know too much about ourselves, and therefore assume that our knowledge is obvious to an outside observer. The illusion of transparency also works in the opposite direction: we overestimate our own ability to recognize other people's lies.

Barnum effect

A common situation: a person stumbles upon a horoscope. He does not, of course, believe in such pseudoscience, but decides to read it purely for entertainment. Then a strange thing happens: the description of his sign matches his own ideas about himself with uncanny accuracy.

Such things happen even to skeptics: psychologists have called the phenomenon the "Barnum effect," after the 19th-century American showman and skilled manipulator Phineas Barnum. Most people tend to perceive quite general and vague descriptions as accurate descriptions of their own personality - and, of course, the more flattering the description, the better it seems to fit. Astrologers and fortune-tellers exploit this effect.

Self-fulfilling prophecy effect

Another cognitive bias that plays into the hands of fortune-tellers. Its essence is that a convincing-sounding prophecy, even one that does not reflect the truth, can make people involuntarily take steps toward its fulfillment - so a prophecy that objectively had little chance of coming true suddenly turns out to be right.

The classic example of such a prophecy appears in Alexander Grin's story "Scarlet Sails." Egle predicts to little Assol that when she grows up, a prince will come for her on a ship with scarlet sails. Assol fervently believes the prediction, and the whole town comes to know of it. Then Captain Grey, having fallen in love with the girl, learns of the prophecy and decides to make Assol's dream come true. In the end Egle turns out to be right, although the happy ending was brought about by far from fairy-tale mechanisms.

Fundamental attribution error

We tend to explain other people's behavior by their personal qualities, and our own actions by objective circumstances - especially when it comes to mistakes. Another person is surely late because of his lack of punctuality, while our own lateness can always be blamed on a broken alarm clock or traffic jams. And this concerns not only official excuses but also our internal view of the situation - an approach that prevents us from taking responsibility for our actions. So anyone looking to improve themselves should keep the fundamental attribution error in mind.

Moral credential effect

A journalist known for his liberal views is caught making homophobic remarks; a priest takes a bribe; a senator who champions family values is photographed in a strip club. These seemingly extraordinary cases follow a sad pattern known as the moral credential effect. When a person builds a solid reputation as a "righteous man," at some point he may fall under the illusion that he really is sinless - and if he is so good, then a little weakness won't change anything.

Availability cascade

A collective belief in an idea becomes much more persuasive when the idea is repeated over and over in public discourse. We often encounter this in conversations with grandmothers: many pensioners are confident that anything talked about frequently on television must be true. The new generation is likely to feel this effect through Facebook.
 

Cognitive Biases That Affect Making Good Decisions




1. Anchoring effect
People rely too heavily on the first piece of information they hear. In a salary negotiation, whoever makes the first offer sets the range of possibilities in the other person's mind.

2. Availability heuristic
People overestimate the importance of the information that is readily available to them. A person may argue that smoking is not harmful to health because he knows someone who smoked three packs a day and lived to 100.

3. Conformity
Conformity is the tendency of group members to give up their own opinions in favor of the group's. The more people in the group, the stronger the conformity effect - and the lower the group's psychological age. The likelihood that a person will accept a decision increases if the majority of the group already adheres to it. Conformity is one reason most meetings are not very productive.

4. Blind spot effect
Failing to recognize your own cognitive biases is itself a cognitive bias. People notice erroneous behavior and motivations far more readily in others than in themselves.

5. Choice-supportive bias
Once you have chosen something, you tend to feel positive about it, even when the choice has objective flaws. For example, you think your dog is wonderful - even if it regularly bites people.

6. Clustering illusion
The tendency to see patterns in random events. It underlies many gambling misconceptions - for example, the idea that in roulette red comes up more often after a pair of reds.

7. Confirmation bias
We tend to listen to information that confirms our point of view and to overlook information that refutes it. This is one of the many reasons it is so difficult to have an intelligent conversation about climate change.

8. Conservatism bias
People prefer earlier evidence to new evidence, old information to new information. People did not immediately accept that the Earth is round, because they did not want to abandon the earlier flat-Earth picture.

9. Information bias
The tendency to keep seeking information even when it can no longer influence the decision. More information is not always better: knowing less, people often make more accurate predictions. A close relative is the ostrich effect - ignoring dangerous or negative information by burying one's head in the sand. Research shows that in falling markets, investors check the value of their assets much less frequently.

10. Outcome bias
Judging a decision by its outcome rather than by how sound it was at the moment it was made. Just because you won big in Vegas doesn't mean gambling was a smart decision.

11. Overconfidence
Some of us are too confident in our abilities, and this makes us take extra risks in everyday life. Paradoxically, experts are more prone to this effect than ordinary people.

12. Placebo effect
The mere belief that something affects you causes it to have that effect. A classic medical example: dummy pills often work on people just as well as real ones.

13. Pro-innovation bias
An inventor overestimates the importance and usefulness of his innovation for consumers, while remaining sure that competitors will need years to copy his development. The Chinese don't think so.

14. Recency bias
People treat the latest information as the most reliable while ignoring older data. Investors often assume that if the market is going up, it will keep going up.

15. Salience
The tendency to focus only on the most easily recognizable features or traits. For example, when you think about death, you worry more about being killed by a lion, although statistically you are far more likely to die under the wheels of a car.

16. Selective perception
The tendency to let our expectations influence how we perceive the world. For example, during a football match, players notice more violations by the opposing team than by their own.

17. Stereotyping
Expecting a person or group we don't know to possess certain qualities. It lets us make decisions quickly, but the reliability of those decisions leaves much to be desired.

18. Survivorship bias
An error that arises because we focus only on the information that has reached us. We may think that being an entrepreneur is easy because we never hear about those who failed and went broke.

19. Zero-risk bias
People prefer certainty even when it is counterproductive. For example, they would rather keep a quiet, low-paid rank-and-file job than take a well-paid but stressful leadership role.
 