How cognitive misers think

Lord777

Psychologists have long known that we tend to overestimate the power of our own thinking. We confidently answer questions we do not actually know the answers to, and we ignore information that could point us toward the correct solution.

Try to solve the following problem. Don't look ahead; focus only on the conditions:

Jack looks at Anna, and Anna looks at George. Jack is married, and George is not. Is a person bound by marriage looking at a person who is not bound by such a bond?

Answer options: 1) Yes; 2) No; 3) Impossible to determine.

This example often appears in texts on rationality, and it suggests that most of us are cognitive misers. The problem itself seems easy, yet about 80% of people answer it incorrectly. We do not know whether Anna is married; therefore the question cannot be answered. That is how those who choose the third option reason.

This answer seems reasonable, but it is wrong. There is no arrogance in it: we refuse to answer because the answer does not follow directly from the conditions of the problem. The trouble is that most of us simply refuse to consider the options.

Although we know nothing about Anna, we can enumerate the possibilities. If she is married, the first answer is correct (married Anna is looking at unmarried George). If she is not married, the first answer is again correct, not the third (married Jack is looking at unmarried Anna).

So the first answer is correct under either assumption. Intellectual modesty turns out to be intellectual miserliness: we choose the easiest way out because we automatically try to save mental resources.
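The case analysis above can be made fully explicit. Here is a minimal sketch in Python (the function name and data layout are just an illustration of the reasoning, not part of any standard library): we try both possible marital statuses for Anna and check whether, in each case, some married person is looking at an unmarried one.

```python
def married_looks_at_unmarried(married):
    """married: dict mapping each person to True (married) or False.

    Returns True if any married person looks at an unmarried person,
    given the fixed "looks at" relation from the puzzle.
    """
    looks_at = {"Jack": "Anna", "Anna": "George"}
    return any(married[a] and not married[b] for a, b in looks_at.items())

# Jack is married, George is not; Anna is unknown, so we try both cases.
for anna_married in (True, False):
    status = {"Jack": True, "George": False, "Anna": anna_married}
    print(anna_married, married_looks_at_unmarried(status))
# Both iterations print True: the answer is "yes" regardless of Anna's status.
```

This is disjunctive reasoning: instead of stopping at "Anna's status is unknown," we branch on both possibilities and find that they lead to the same conclusion.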

The phenomenon of the "cognitive miser" is well studied in cognitive and behavioral psychology. The term was first used by the American psychologists Susan Fiske and Shelley Taylor in their 1984 book Social Cognition, where it described a strategy of cognition and behavior that reduces new knowledge to what is already known.

Later, such errors came to be treated as failures in processing available information, or heuristic errors. Research on heuristics made Daniel Kahneman famous; he later received the Nobel Prize for a number of his discoveries. One common heuristic is the recognition heuristic.

If a person is asked which of two cities is larger, they will name the one they recognize, provided they believe that city to be big. If they know the recognized city is small, they will choose the unfamiliar one instead.
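That decision rule can be sketched in a few lines. This is a hypothetical illustration (the function, its parameters, and the example cities are assumptions for the sketch, not a model from the literature): pick the recognized option, unless you also know it to be small, in which case pick the unfamiliar one.

```python
def pick_larger_city(a, b, recognized, known_small=()):
    """Guess the larger of two cities using the recognition heuristic.

    recognized: set of city names the person has heard of.
    known_small: cities the person specifically knows to be small.
    Returns the guess, or None when the heuristic gives no verdict.
    """
    knows_a, knows_b = a in recognized, b in recognized
    if knows_a and not knows_b:
        return b if a in known_small else a
    if knows_b and not knows_a:
        return a if b in known_small else b
    return None  # both or neither recognized: the heuristic does not apply

print(pick_larger_city("Berlin", "Chemnitz", recognized={"Berlin"}))
```

The point of the sketch is the `None` branch: the heuristic is cheap precisely because it only fires when recognition alone discriminates between the options.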

In making decisions, we rely on the information we already have, and in most cases this works well. But the strategy often makes it hard to take in the additional information a situation offers us. The resulting mistakes occur everywhere in life, and their consequences can be far more serious than wounded pride or a wrong answer in a puzzle about conditionally married and unmarried people.

The cognitive miser tends to spend as little time and energy on thinking as possible and is usually unwilling to view a situation from different points of view. To be clear: the cognitive miser is not a stock figure, a particular character type, or a specific style of thinking. It is a trait that, to some degree, is inherent in all people. Being stingy with our intellectual resources is a strategy that evolution built into us, and individual differences matter little here.

Psychologists usually distinguish two types of thinking: fast and slow. The first works automatically, requires little energy, and switches on instantly. Slow thinking engages when we solve particular tasks and problems; it requires concentrated attention and consumes a lot of energy.

In the problem about Anna, George, and Jack, quick, emotional reactions of the first type do not help us at all. But slow thinking, as we have seen, often fails here too. Within "slow" thinking, then, it is useful to distinguish an algorithmic level and a reflective level.

Algorithmic thinking, as the name implies, operates according to known algorithms: it is a kind of formula substitution that can easily be learned. It also helps us break a task into several elements and move sequentially from one part to the next.

Reflective thinking requires even more energy, because it helps us not only solve a problem but also examine it from different angles: to turn it over in our hands and look closely at how it works. Reflective thinking calls into question the conditions that are handed to us as a matter of course. In everyday life we do this very rarely.

We often succeed in finding a solution to a problem, but we far less often manage to redefine the problem itself.

To see what is at stake, consider another problem (based on research by Peter Ubel, a professor of medicine). 200 children, divided into groups A and B, are awaiting liver transplants, and you have only 100 organs to distribute. How do you allocate them? You could give 50 organs to group A and the remaining 50 to group B, which seems fair enough. But what if the division into groups reflects the prognosis of the disease?

Say the first group contains 100 children with an 80 percent chance of recovery, while the children in the second group have only a 20 percent chance. In Ubel's experiment, more than a third of the participants in this case still distributed the organs equally between the groups, arguing that "hope should be given even to those who have almost no chance." When the children were not divided into groups, participants calmly allocated organs according to the probability of survival. It took only removing the word "group" from the wording of the task for its solution to change fundamentally.

When the children were divided into groups, the participants acted irrationally, guided by abstract notions of fairness. The error arose precisely from cognitive miserliness: if you think through the conditions of the problem, it becomes clear that the 50/50 distribution cannot be justified by any reasonable argument.
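The arithmetic behind that claim is worth making explicit. Using the probabilities stated in the problem (the function below is just a worked-example sketch of that arithmetic), an equal split saves far fewer children in expectation than giving all the organs to the group with the better prognosis:

```python
def expected_survivors(organs_to_a, p_a=0.8, p_b=0.2, total_organs=100):
    """Expected number of surviving children for a given allocation.

    organs_to_a organs go to group A (80% recovery chance per transplant);
    the rest go to group B (20% recovery chance per transplant).
    """
    return organs_to_a * p_a + (total_organs - organs_to_a) * p_b

print(expected_survivors(50))   # equal split: 40 + 10 = 50 expected survivors
print(expected_survivors(100))  # all organs to group A: 80 expected survivors
```

The "fair" 50/50 split costs 30 expected lives relative to the prognosis-based allocation, which is exactly why the equal split cannot be defended once the conditions are actually thought through.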

But the cognitive miser does not think about the conditions. He acts, sometimes very ingeniously, within the conditions the situation presents to him. There is nothing wrong with that in thought experiments. But imagine people acting the same way in medical practice (and similar, if less obvious, cases occur everywhere).

"Cognitive misers allow the structure of the world around them to control their thoughts. Cognitive misers accept any way a problem is presented and start from the given point, never considering that if the question were framed differently, they would draw different conclusions."

- Keith Stanovich, Professor, Department of Human Development and Behavioral Psychology, University of Toronto

In his book Rational Thinking, Keith Stanovich convincingly shows that intelligence and rationality are not the same thing. People with PhDs and IQs above 120 can be just as much cognitive misers as the worst students.

Mastering algorithms for solving problems does not make you rational. You need to be able to call the given conditions into question; otherwise they will control us, and not we them. Moving from cognitive miserliness to generosity is quite possible, though it cannot always be sustained: even the most exacting critic will not always have the strength for it. Reflective thinking demands energy and heightened mindfulness, but it too can be learned, if you want to.