Except, of course, that we're not. But it is true that humans are relatively bad at purely rational thinking. This should perhaps not surprise us: We are not, after all, computers, which are far better than humans at making rational decisions and performing rational calculations about situations. This is not entirely a bad thing: Humans have apparently (through the process of evolution) sacrificed the ability to make perfectly rational calculations for the ability to excel at what those who are trying to teach computers to think like humans call fuzzy thinking. We are good, for example, at reading another person's internal emotional state from the tilt of their eyebrows, but we are relatively bad at calculating the odds of whether to take another card in blackjack - to the unending enrichment of the Las Vegas casinos.
However, while there do seem to be trade-offs for not being as skilled at rational thought as we might like to think we are, this does not mean that we should not attempt to understand in a systematic way how humans tend to make mistakes in their rational calculations, so that (if we choose) we can act more rationally than is typical (or arguably natural) of humans. This paper examines two of the systematic mistakes that humans tend to make when they make decisions they are likely to consider rational: the mistakes (or inclinations) toward both pessimistic and optimistic biases.
Although one might suspect that humans would err in one direction or the other in a systematic way (i.e., regularly guess that their chances are better than they actually are, or regularly guess that they are worse), individuals are in fact likely to make faulty decisions in both directions, although in different circumstances. For example, humans are far more likely than reality justifies to believe that they will be affected by misfortune.
This is in part because people are likely to overestimate the likelihood of events that are in fact quite rare. If you were to survey people walking down the sidewalk on a typical New York street, for example, a number of them would be likely to report that they are afraid of dying in a terrorist attack. This is not irrational per se, but even in our post-9/11 world an individual is far more likely to die of cancer or heart disease than in a terrorist attack. And yet, while heart disease is relatively avoidable (one can significantly reduce one's chance of heart disease by eating well, exercising sensibly, lowering the fat in one's diet, etc.), most individuals do not act to do so - in some part, at least, because they are too busy worrying about the remote possibility of a terrorist attack to get their cholesterol checked.
This phenomenon is summarized below:
People overestimate the frequency of infrequent events and underestimate frequent ones.
Many people are more afraid of dying in an act of terrorism than they are of dying in an auto accident, despite the fact that they are MUCH more likely to die in an auto accident (http://www.math.byu.edu/~jarvis/gambling/gambling-fallacies.html).
This can be seen to be a mistake (in terms of a purely rational assessment of a situation along the lines of mathematical probability) that includes bias both toward the positive and the negative. Even as people are more inclined than they should be to believe that something terrible and rare will happen to them, they are at the same time inclined to a sort of irrational optimism in which they believe that they are likely to escape unfortunate (and even lethal) occurrences that are in fact quite common.
Another way of summarizing this is suggested in the following example (or perhaps we might better see it as a parable):
"If you were told that you have a one in fourteen million chance of getting cancer in the next seven days people will say 'oh well it is obviously not going to happen to me it is so infinitesimal' but the fact that there is a one in fourteen million chance of winning the lottery people think 'yes, it's got to be someone why can't it be me'" (http://www.bbc.co.uk/worldservice/sci_tech/features/figure_it_out/lottery.shtml).
Humans are also inclined to commit what logicians and mathematicians refer to as the "availability error" (this is especially true in gambling). This is the tendency "to focus only on good, unusual, or easily remembered experiences" rather than remembering more common, but less interesting or fortunate, events (http://www.math.byu.edu/~jarvis/gambling/gambling-fallacies.html). In other words, we are more likely to remember (in a fit of optimistic irrationality) that someone bought a winning lottery ticket at the very same grocery store where we ourselves buy the odd ticket than to acknowledge that thousands of people have bought losing tickets at that same store.
Part of the irrationality in the kinds of assessments that we have been discussing may well reflect a simple lack of mathematical skills. In other words, no small part of the reason that people tend to make irrational decisions arises from their mathematical shortcomings. (This is not to say that it does not reflect a tendency toward the irrational in human thought; it might well be argued that people's essential lack of mathematical skill is either the cause of, or in fact the same as, our inability to excel at rational decision-making processes.)
"In a recent survey 21% of people thought that if they put the same numbers on to the lottery for the rest of their lives that they would have a chance of winning. The reality is they would have to put the same numbers on 135,000 years before they would have an evens chance of winning.
People really don't understand what it means to have a one-in-a-14-million chance. People have no idea how big 14 million is." (http://www.math.byu.edu/~jarvis/gambling/gambling-fallacies.html).
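The arithmetic behind that 135,000-year figure can be checked with a short sketch. The quote does not name the game, so the 6-from-49 format (odds of roughly 1 in 14 million) and one ticket per weekly draw are assumptions here:

```python
from math import comb, log

# Odds of matching all six numbers in a 6-from-49 lottery
# (an assumption: the quote's "one in fourteen million" fits this format).
odds = comb(49, 6)          # 13,983,816 possible tickets
p = 1 / odds                # chance of winning on any single draw

# Naive "evens chance": play until the expected number of wins reaches 0.5.
weeks_naive = 0.5 / p
print(f"{weeks_naive / 52:,.0f} years at one draw per week")

# The exact 50% threshold, solving 1 - (1 - p)**n >= 0.5 for n,
# is even further away:
weeks_exact = log(0.5) / log(1 - p)
print(f"{weeks_exact / 52:,.0f} years for a true 50% chance")
```

The naive calculation lands at roughly 134,000 years, close to the quoted 135,000; the exact probability threshold is closer to 186,000 years. Either way, the innumeracy point stands.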
A tendency to drastically underestimate the frequency of coincidences is a prime characteristic of what researchers call innumerates, who generally accord great significance to correspondences of all sorts while attributing too little significance to quite conclusive but less flashy statistical evidence (Paulos, 2001, p. 35).
Another way of saying this is that coincidence is far more common than we tend to think that it is, and while it is perfectly acceptable for us to find amusement or pleasure in such coincidences, we should not read into them any mystical or grand meaning. (In the same way we may be enchanted by the color and fragrance of an apple blossom while crediting both hue and scent to the mechanistic demands of evolution rather than the whim of the gods. There is no reason at all that rationality and an appreciation of pleasure cannot go hand in hand.)
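A standard illustration of just how common coincidence is (not drawn from the sources above, but a textbook example) is the birthday problem: in a room of only 23 people, the chance that two share a birthday already exceeds one half, far below the intuitive guess. A minimal sketch, assuming 365 equally likely birthdays:

```python
def no_shared_birthday(n: int) -> float:
    """Probability that no two of n people share a birthday,
    assuming 365 equally likely birthdays."""
    p = 1.0
    for i in range(n):
        p *= (365 - i) / 365
    return p

for n in (10, 23, 50):
    print(f"{n} people: {1 - no_shared_birthday(n):.0%} chance of a shared birthday")
```

Most people guess the 50% point falls somewhere near 183 people; it is actually 23, a coincidence-frequency error of nearly an order of magnitude.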
Much of the tendency either to overestimate bad possible outcomes (i.e., a bias toward negative irrationality) or to overestimate good possible outcomes (i.e., a bias toward positive irrationality) arises in large measure from people's inability to understand (or to construct for themselves) a fair sample of a larger population. One of the strengths of human intelligence (and in many ways this serves us very well) is our ability to generalize from a few examples of a class to the entire class.
This ability has clear evolutionary advantages. Say, for example, that an early human has seen one or two bear attacks on other humans. Shortly after the second attack, that human sees a bear ambling toward her from some distance away. Basing her reaction on only two previous encounters with bears, this human decides to run as fast as possible in the opposite direction. This is a choice with a number of clear evolutionary advantages (not being eaten is a good way to pass on your genes to the next generation).
It might well be, of course, in this particular case, that our proto-human has made an error: This bear might not in fact be hungry at all, having already recently dined on someone from the next set of caves. Thus the human in this case would in fact be guilty of making an irrational calculation weighted toward the negative. But how many of us in the same situation would not also make this same calculation - given the possible outcome if we were to assume otherwise?
Here is a summary of this human (pragmatic if not strictly rational) tendency to generalize:
While we sometimes learn general principles by being told about them, it is certainly an essential part of human learning for us to form generalizations by observing what goes on around us. We don't have to be told peaches have pits if we have eaten a lot of them and found a pit in every one. The basketball coach evaluating a new player can decide…