Daniel Kahneman, author of "Thinking, Fast and Slow," has spent many years dissecting the way people think and how they arrive at their judgments. He is a psychologist who worked for decades with a fellow psychologist, Amos Tversky, who passed away before the two could share the Nobel Prize for their findings. Kahneman's greatest recognition, however, was not for psychology (for which there is no Nobel) but for economics. He helped develop a model of how people choose among gambles whose outcomes are known. He is a prolific author and thinker who has been primarily interested in error.
In his book, he details how he and his colleague Tversky came across errors in judgment that are fundamental to much of research, and to how people anecdotally perceive one another. The observations he made took place over a period of more than 30 years, during which he used them to create prospect theory, corollaries in behavioral economics, and other advances. He determined that people are prone to a number of errors that impede their use of critical thinking. He said that two systems control the brain, System I and System II, the first of which is "gullible and biased to believe" while the second "is in charge of doubting and unbelieving" (81). His belief is that if the first system somehow overrides the second, or if the second system is otherwise engaged, people will make errors in judgment because System I is prone to make them. This paper looks at three of the errors people commonly make when it comes to critical thinking: exaggerated emotional coherence, intensity matching, and the anchoring effect.
Exaggerated Emotional Coherence
There is a phenomenon common in people regardless of their desire to suppress it. Exaggerated emotional coherence, more popularly known as the halo effect, is defined by Kahneman as "the tendency to like (or dislike) everything about a person -- including things you have not observed" (82). He illustrates it with politics. In the United States, two primary parties dominate political discourse. Many have labeled these parties as adhering to either conservative or liberal principles, which are diametrically opposed to one another. This makes it easy for adherents of one party to agree vehemently with their own side while dismissing everything the other side says. Observed from the outside, this is obviously flawed logic, since no person is correct about an issue 100% of the time, but that is the strength of the halo effect.
Kahneman says this happens as a sort of layering effect. Consider a politician: the politician may say one thing an individual agrees with; then a columnist the individual likes has good things to say about the politician; then a close personal friend reports that they, too, think the politician's stances are sound. This layering of opinions, whether it takes place consciously or unconsciously, adds to the favorable or unfavorable opinion the individual holds about the politician. Eventually, if the individual hears enough positive statements from enough people they consider influential, they will apply the halo effect to all of the politician's statements and actions.
It does not matter where the effect is applied. Critical thinking is subverted by this flaw in judgment because the individual becomes irrational in their acceptance or vilification of an idea or a person. The bias is an unseen danger to the act of critical thinking because once an issue or person becomes the focus of critical thought, it has achieved a position that allows System I thinking to overshadow reason and System II thinking.
He also observed that word choice matters (83). When a person was described using certain bias-laden words (generous, kind, envious, hateful), people came to believe these things about the person and found it impossible to reverse their first impressions. Kahneman ran an experiment on himself when he discovered he was grading exams according to the halo effect, and concluded that people need to "decorrelate error" (83). By this he means that to critically evaluate a person's statements, each must be taken as a separate entity and evaluated on its own merits.…