Devlin's Angle
The following problem is typical of the scenarios considered by Tversky and Kahneman.
Imagine you are a member of a jury judging a hit-and-run driving case. A taxi hit a pedestrian one night and fled the scene. The entire case against the taxi company rests on the evidence of one witness, an elderly man who saw the accident from his window some distance away. He says that he saw the pedestrian struck by a blue taxi. In trying to establish her case, the lawyer for the injured pedestrian establishes the following facts:

1. Of the taxis in the town, 85% are black and 15% are blue.
2. A series of tests showed that the witness correctly identifies the color of a taxi 4 times out of 5.
If you were on the jury, how would you decide?
If you are at all typical, faced with eye-witness evidence from a witness who has demonstrated that he is right 4 times out of 5, you might be inclined to declare that the pedestrian was indeed hit by a blue taxi, and assign damages against the Blue Taxi Company. Indeed, if challenged, you might say that the odds in favor of the Blue Company being at fault were exactly 4 out of 5, those being the odds in favor of the witness being correct on any one occasion.
The facts are quite different. Based on the data supplied, the mathematical probability that the pedestrian was hit by a blue taxi is only 0.41, or 41%. Less than half. In other words, the pedestrian was MORE likely to have been hit by a black taxi than a blue one.

The error in basing your decision on the witness's accuracy figure alone is that it ignores the fact that, on the figures given, ANY taxi in the town is overwhelmingly likely to be black. If the witness had been unable to identify the color of the taxi, but had only been able to state--with 100% accuracy, let us suppose--that the accident was caused by a taxi, then the probability that it had been a black taxi would have been 85%, the proportion of taxis in the town that are black. So BEFORE the witness testifies to the color, the chances are low--namely 15%--that the taxi in question was blue. This figure is generally referred to as the 'prior probability', the probability based purely on the way things are, not the particular evidence pertaining to the case in question. When the witness then testifies as to the color, that evidence INCREASES the odds from the 15% prior probability figure, but not all the way to the 80% figure of the witness's tested accuracy. Rather, the reliability figure for the witness's evidence must be combined with the prior probability to give the real probability. The exact mathematical manner in which this combination is done is known as 'Bayes' law', and in this case it gives an answer of 41%.
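For readers who want to see the arithmetic, here is one way to set out the Bayes' law calculation behind the 41% figure. Write B for "the taxi was blue", B-bar for "the taxi was black", and W for "the witness says it was blue" (the symbols are my own shorthand, not part of the problem), and assume, as the problem intends, that the witness's 4-out-of-5 accuracy applies equally whether the taxi is blue or black:

```latex
P(B \mid W)
  = \frac{P(W \mid B)\,P(B)}{P(W \mid B)\,P(B) + P(W \mid \bar{B})\,P(\bar{B})}
  = \frac{0.8 \times 0.15}{0.8 \times 0.15 + 0.2 \times 0.85}
  = \frac{0.12}{0.29}
  \approx 0.41 .
```

The numerator counts the cases in which the taxi really is blue and the witness says so; the denominator adds in the cases in which the taxi is black but the witness mistakenly calls it blue.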
(Okay experts, I'm being a bit loose in how I expressed things in the above paragraph. But MAA Online is open to all-comers, not just those with a course in probability theory under their belts. This is a column, not a lesson. Indeed, it's a column about people's intuitions, not about mathematical precision.)
The moral many writers try to draw from problems like the taxi-cab scenario is that human beings are innumerate, and as a result are not always able to make rational decisions. "Improve our math classes in the schools," say these experts, "and we will all be better equipped to make sound judgments." Now, improving school mathematics instruction may or may not have a beneficial effect on society. But to my mind, the fact that the vast majority of people get the taxicab problem 'wrong' is not an argument in favor of teaching mathematics, and it certainly does not show that "innumerate" people make "poor" judgments. What that particular example shows, I would suggest, is that there can be a big difference between rational behavior and numerically or logically based behavior. The jurist who assigns blame to the Blue Cab Company might display a form of innumeracy, but he or she could still be acting rationally. Indeed, the jurist probably is acting rationally. Here's why.
First, let's be clear about what the use of Bayes' law tells us in this case. Provided that the stated proportions of blue and black taxis, namely 15% blue and 85% black, hold uniformly throughout the town, or at least that these figures are reliable in the region where the accident occurred, the 41% figure for the chance of the pedestrian being hit by a blue taxi is accurate. Given the right circumstances, Bayes' law is totally correct. So, assuming that the various figures quoted are reliable, the probability that the Blue Company is guilty is indeed a mere 0.41, and the 'chances are' that it is not to blame. Clearly, there is more than enough 'reasonable doubt' here, and a rational jurist to whom this application of Bayes' law is explained should act accordingly.
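If the 41% figure still feels wrong, a frequency argument may help: imagine the same accident-plus-testimony situation repeated many times, and count how often a "blue" report actually corresponds to a blue taxi. The short simulation below is a minimal sketch of that argument (the code is mine, not part of the column; it assumes only the figures given in the problem, namely 15% blue taxis and a witness who names the correct color 4 times out of 5):

```python
import random

random.seed(0)

N = 100_000            # number of simulated hit-and-run incidents
reports_of_blue = 0    # incidents in which the witness says "blue"
blue_given_report = 0  # of those, incidents in which the taxi really was blue

for _ in range(N):
    # 15% of the town's taxis are blue, 85% are black
    taxi_is_blue = random.random() < 0.15
    # the witness names the correct color 4 times out of 5
    correct = random.random() < 0.80
    says_blue = taxi_is_blue if correct else (not taxi_is_blue)
    if says_blue:
        reports_of_blue += 1
        if taxi_is_blue:
            blue_given_report += 1

# fraction of "blue" reports that really were blue taxis -- close to 0.41
print(blue_given_report / reports_of_blue)
```

Run it and the printed fraction comes out close to 0.41, in agreement with the Bayes' law calculation.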
On what basis then do I claim that the jurist who, ignorant of the use of Bayes' law, decides the Blue Company is at fault, can be said to be acting rationally? Well, over thousands of years of evolution, human beings have learned to make decisions that are beneficial--beneficial firstly to themselves and their nearest and dearest, then to others in their society. Only very rarely are they in full possession of enough information to make what an analyst would declare to be the 'best' decision. Typically, humans have to make decisions based on very limited information, quite often only the information provided by their own eyes, ears, and sense of smell. During the course of our evolutionary history, we have become very good at making optimum decisions based on such evidence. Those who were not adept at identifying potential danger often did not survive long enough to pass on their genes. Moreover, seeing, hearing, and smelling have a conscious immediacy that gives us overwhelming faith in information acquired through those senses, far more than in information we read about or are told of. Seeing is particularly strong in this regard. Thus, both on evolutionary grounds and on the basis of our own conscious experience, we tend to put great significance on information acquired first hand through sight, and, what is more, it is entirely rational to do so.
It is entirely consistent with our rational tendency to rely on what we see with our own eyes that we likewise place great reliance on information reported to us directly by others who have seen it for themselves. "I saw it with my own eyes" amounts to a personal guarantee of truth when someone reports something he has seen. By contrast, neither evolution nor our own experience has equipped us to have a 'feel' for highly abstract information based on numerical data about a large population we cannot possibly see. The information that 15% of the taxicabs in the town are blue and 85% are black and that their distribution through the town is uniform is mathematically precise but entirely abstract--we do not SEE it.
In our daily lives we are constantly faced with evaluating information and making decisions, in many cases decisions on which our lives quite literally depend (crossing the street, driving a car, and so on), yet we rarely make those decisions on the basis of statistical data of the taxicab variety. It is hardly surprising, then, that most of us, when faced with the kind of problem confronting the jurist in the taxicab case, tend to downplay evidence based on statistical data and put great significance on eye-witness accounts. In the taxicab case, it is undoubtedly wrong to reason that, because a series of tests shows the eye-witness to be right 4 times out of 5, the probability of what he says being true on the occasion in question is also 4 in 5. But it is not at all irrational to reason in this way, and those of us whom society considers 'numerate' should not, I suggest, sneer at those who "get the problem wrong." In many ways, and certainly in human terms, the popular answer is the "right" one. As a jurist, you could only be accused of irrationality if, faced with a clear explanation of the application of Bayes' law to the case, you refused to change your original evaluation of the eye-witness's evidence.
The above is adapted from the book Goodbye Descartes: The End of Logic and the Search for a New Cosmology of Mind, to be published by John Wiley and Sons in January 1997.
Devlin's Angle is updated at the start of each month.