A closer look at rationality

Keith Stanovich has written an interesting book titled What Intelligence Tests Miss. The book is about the fact that IQ tests are incomplete measures of cognitive functioning. As studies have shown, there is in fact only a low to medium correlation between rational thinking skills and IQ test performance. And because rational thinking skills and IQ are largely independent, it is not surprising that intelligent people can easily behave irrationally and hold false and unsupported beliefs. Several things are really interesting about this book. One is the author's insight that we do not need to stretch to non-cognitive domains (to notions such as emotional intelligence or social intelligence) to see the lacunae in IQ tests. Another is the very specific and research-based analysis of the subject matter.


The tripartite framework of cognitive functioning
The author presents an elegant and rather comprehensive model of cognitive functioning in which three major types of thinking processes and their interrelations are described: the autonomous mind, the algorithmic mind and the reflective mind.
The autonomous mind refers to rapidly executed mental processes that do not require conscious attention and are often quick and dirty. The algorithmic mind refers to conscious, efficient information processing and is linked to what is usually referred to as fluid intelligence. The reflective mind is linked to rational thinking dispositions and deals with questions such as which goals to choose and why, and what action to take given those goals. In this model, conscious thinking can override unconscious thinking, which is a good thing given the quick-and-dirty nature of the autonomous mind. The algorithmic mind is required for executing this override and is thus very important. But it is the reflective mind that initiates such an override. People with a high IQ may be quite capable of overriding false beliefs and erroneous judgments, but it takes the rationality of the reflective mind to initiate such an override.

Intelligent but irrational
Although many laymen and psychologists seem to think IQ tests measure rationality, they actually don't. In fact, intelligence as measured by IQ tests correlates only weakly to moderately with rational thinking skills. According to Stanovich, this explains why it is not strange to see intelligent people behave irrationally and hold false and unsupported beliefs. Some real-world examples: intelligent people who fall prey to Ponzi-scheme swindlers like Bernie Madoff, a highly educated person who denies the evidence for evolution, a United States president who consults an astrologer, and so forth. Below, I will try to summarize how Stanovich explains rationality and the lack of it.

What is rationality?
Cognitive scientists distinguish two basic forms of rationality: 1) INSTRUMENTAL RATIONALITY, behaving in such a way that you achieve what you want, and 2) EPISTEMIC RATIONALITY, taking care that your beliefs correspond with the actual structure of the world. Irrational thinking and behaving is associated with three things.

The first is an overreliance on the autonomous mind, which subconsciously and automatically uses all kinds of heuristics to come to conclusions and solve problems. The autonomous mind is fast and very valuable but also very imprecise; it is prone to all kinds of biases. Thinking deliberately instead of letting the autonomous mind make judgments costs much more time and energy, which is why it is tempting to avoid it.

The second thing associated with irrationality is what is called a mindware gap. The term 'mindware' refers to the rules, knowledge, procedures, and strategies that a person has available for making judgments, making decisions, and solving problems. A lack of such mindware hinders rationality.

The third thing associated with irrationality is something called contaminated mindware: beliefs, rules, strategies, etc. that are not grounded in evidence and that are not good for the one who holds them (the host) but which can still spread easily throughout a population. There are several reasons why contaminated mindware can spread easily: 1) it is often packaged in an appealing narrative which promises some kind of benefit to the host, 2) it sometimes rides on the back of other, more valid, popular mindware by copying superficial characteristics from that mindware, 3) it contains self-replication instructions ('forward this email to 10 different people'), 4) it may have evaluation-disabling properties (for instance by claiming that evidence is irrelevant or impossible, by turning belief unsupported by evidence into a virtue, by encouraging adherents to attack non-believers, etc.). You might think that intelligence would offer good protection against contaminated mindware, but this turns out to be wrong. Complex narratives can even be extra attractive to highly intelligent people. Furthermore, studies have demonstrated that intelligent people may be more capable of creating 'islands of false beliefs' or 'webs of falsity' by using their considerable computational power to rationalize their beliefs and to ward off the arguments of skeptics.

Some deliberations on the desirability of rationality
Here are some thoughts and questions about what the view presented in the posts might imply. Let me start by saying that I find the basic ideas presented in Keith Stanovich's book convincing, namely that: 1) intelligence as measured by IQ tests and rationality are largely independent, which explains why intelligent people may behave and think irrationally, 2) IQ tests don't measure rationality, and the contrast between the strong focus on IQ testing and the very limited attention to measuring and teaching rational thinking is a bad thing, 3) rational thinking could be taught more, and this would lead to social benefits. Here are some additional thoughts and questions on the desirability of raising rationality.

Perfect rationality is out of the question. That this is so can be understood from an evolutionary perspective. As Stanovich explains in his book, evolution does not lead to perfect rationality because natural selection does not specifically favor maximizing truth or utility. Instead it favors genetic fitness in a local environment. This means developing rationality is a matter of optimization instead of maximization. Spending extreme resources on building rationality does not guarantee an evolutionary advantage because those resources might also have been spent on other useful things. As Richard Dawkins says in his latest book: "Perfection in one department must be bought in the form of a sacrifice in another department".

That maximal rationality is undesirable and impossible also follows from Stanovich's tripartite model of the brain, which consists of the autonomous mind, the algorithmic mind and the reflective mind. It is true that the autonomous mind works with rough heuristics which operate in a quick and dirty way and may frequently miss the mark. An override by the deliberate part of the brain (which consists of the algorithmic mind plus the reflective mind) can help to correct the inaccuracy of the autonomous mind and make judgments and decisions more rational. But because deliberate thinking demands so much attention, it would be impossible to let it make all judgments and decisions. So much of everything we do and think has been 'delegated' to the autonomous mind that this would be unthinkable. Some division of labor between the autonomous mind and the deliberate mind is efficient. The question is how to divide it most effectively. How often and when should the deliberate mind override the autonomous mind? How can we recognize situations which call for such overrides? When must we demand rationality from ourselves and from others?

Another perspective on the question of how much rationality is desirable follows from looking at its advantages and disadvantages. It seems logical that increasing one's rationality is usually beneficial, both for the individual and for society. After all, increasing instrumental rationality means that one becomes better at goal-directed thinking and acting. And increasing epistemic rationality means that one's maps of the world become more realistic; in other words, one's beliefs about reality correspond more closely to the actual structure of reality. But there may be some disadvantages, too. I am not talking about the stereotype of Mr. Spock, the assumption that there is a trade-off between rationality and social or emotional competence. I would predict that rationality and social or emotional competence are largely independent (in the same way that rationality and intelligence are largely independent). I am talking about the possibility that increasing your rationality may be aversive to others and might lead to some extra social barriers, such as social rejection. History shows many examples of people who are now considered to have been ahead of their time in terms of rationality but who were punished by their contemporaries. People challenging widely held beliefs (never mind whether those beliefs are true or not) can be considered a threat to power positions or to the stability of institutions, or can be viewed as disloyal, crazy or arrogant. There are many examples of people who have been ridiculed, isolated, banned, imprisoned, sentenced to death or murdered because of ideas which later turned out to be true. The paradox seems to be: it requires rationality to appreciate rationality.

How to fight contaminated mindware
This leads me to the question of how to fight contaminated mindware. Contaminated mindware refers to a belief system which is not true and which is potentially harmful to the person who holds it and to others, but which can still spread quickly through a population due to some of its characteristics. The question is whether a head-on attack on popular contaminated mindware will lead to its demise or runs the risk of making it even more popular. A head-on attack might lead to further publicity for the contaminated mindware, thus exposing more people to its attractiveness. And it may lead to more attacks on its opponents (because contaminated mindware often contains an instruction to attack opponents and non-believers). Or might a different approach work better? For instance, an approach of teaching people to recognize contaminated mindware more easily and to protect themselves better against it?
