Lord777
Our brains are not perfect, and we make mistakes more often than we might think, as Nobel laureate in economics Daniel Kahneman has shown. We have collected the most useful ideas from his book "Thinking, Fast and Slow".
Why are we wrong
Perceptual errors are inherent in human nature. To explain where they come from, Daniel Kahneman describes two modes of thinking, which he conventionally calls System 1 and System 2.
How the two systems of our brain interact
- System 1 easily processes the data and finds quick answers in memory. We often call this process intuition.
- When System 1 does not have a quick answer, it turns to System 2 for help to work through the issue in detail.
- If System 2 approves System 1's suggestions, they become beliefs or actions, usually passing through almost unchanged.
- In this way, our brain tries to conserve its resources. Kahneman calls the mental shortcuts that simplify decision making heuristics. These same shortcuts can also lead to errors.
What influences our wrong decisions
A problem seems important to us if we can recall it with ease. The illusion of truth or importance arises when something feels familiar: for example, we have already thought about the issue or heard about it many times. That is why it is so easy to believe an advertisement or a news story the media keeps circulating. In turn, the media picks topics of general interest, which therefore seem important.
If information is perceived easily and quickly, we tend to believe it. Our alertness also decreases when we are in a good mood. This is due to ancient human instincts and the workings of the amygdala, the part of the brain responsible for warning us of danger. If the situation is comfortable and safe, our brain receives the corresponding signal: the general mood improves, and we become less alert and less critical of information.
A simple example of the availability heuristic: you learned that there were two plane crashes last month and therefore decided to travel only by train from now on. In fact, train travel carries risk too, but your decision was driven by how readily the crash data came to mind.
Our System 1 loves coherence, that is, cause-and-effect relationships. We try to build coherent stories, as in the movies, even when reality is different; to do this, we turn to System 2, which looks for the missing facts. That is why we are so willing to believe celebrities' success stories while overlooking their useful connections and resources, as well as their real mistakes and failures.
For the same reason, we treat random events as lawful. For example, after three or four successful throws by one athlete, the conviction arises that he has a "hot hand" and will keep playing just as successfully. Although hitting the target could be chance, and the sample (the number of throws) is very small, such illusions are quite common.
Framing is one manifestation of the principle "What you see is all there is". It means that when the wording changes, our opinion changes. For example, the phrase "the disease's death rate is 10%" causes us more anxiety than "the disease's survival rate is 90%", although the two statements mean the same thing.
If System 1 does not find a quick answer to a question, it looks for an easier question of a similar nature and answers it.
For example, to the question "How happy are you?" System 1 gladly substitutes the question "What has my mood been lately?". And if you are asked about the country's unemployment rate, you will most likely first recall whether your relatives and friends have jobs.
The priming (precedence) effect shows that in many ways we do not think independently but are influenced by what we have just encountered.
Recently seen or heard ideas and words influence your thoughts. For example, if you have just seen the word EAT and are asked to complete the word fragment SO_P, you are more likely to write SOUP than SOAP.
Ideas can also influence our actions. For example, in an experiment by Kathleen Vohs, the thought of money evoked individualism in people: a desire to act independently, not to ask for help, and not to help others.
The anchoring effect occurs in two cases. In the first, we understand in which direction an answer needs to be adjusted, and we even find arguments, but we do not know the exact solution. Thus a teenager may turn down the music at the parents' request, yet the compromise satisfies neither side: to the teenager the music now sounds too quiet, to the parents still too loud.
In the second case, we rely on previously given information (anchoring as a priming effect). If you are asked whether Leo Tolstoy was over 120 years old when he died, your estimate of his age will be many years higher than if the question had mentioned the number 50.
What to do to avoid falling for brain tricks
- Know the enemy by sight. If you know how thinking errors arise, they are easier to avoid. For example, when making an important decision as a group, make sure the participants' opinions are as independent of each other as possible. This guards against priming the intuitive System 1 and the lazy System 2.
- Train logical and critical thinking. When you think logically, you do not draw conclusions from limited information and make hasty decisions less often. The habit of analyzing information can and should be developed throughout life, for example on the LogicLike website, which works for both adults and children from the age of five.
- Develop self-reflection. Track your actions and their reasons. When you discover a mistake, think about what influenced the decision: perhaps an emotional reaction, or a reluctance to think the information through.
- Don't stop learning. When you have knowledge and know how to use it, System 1 will recall correct answers more often, and System 2 will spend fewer resources working through them.