## Devlin's Angle

With some trepidation, toward the end of my article I referred back to an earlier column in which I had discussed the infamous Monty Hall Problem. My trepidation was not unfounded. I also received emails from readers who looked back at that earlier piece and were absolutely convinced that the solution I had given (in that earlier column) was wrong. More interesting were emails from people who posed slight variants of the original Monty Hall problem. In this week's column, I'll consider the most common of those variants.

The Monty Hall story so far. In the
1960s, there was a popular weekly US
television quiz show called *Let's Make a Deal.* Each week, at a certain point in the program, the host, Monty Hall, would present the contestant with three doors. Behind one door was a substantial prize; behind the others there was nothing. Monty asked the contestant to pick a door. Clearly, the probability of the contestant choosing the door with the prize was 1 in 3 (i.e., 1/3). So far so good.

Now comes the twist. Instead of simply opening the chosen door to reveal what lay behind, Monty would open one of the two doors the contestant had not chosen, revealing that it did not hide the prize. (Since Monty knew where the prize was, he could always do this.) He then offered the contestant the opportunity of either sticking with their original choice of door, or else switching it for the other unopened door. (As the game was actually played, some weeks Monty would simply let the contestant open their chosen door. The hypothetical version of the game described here, where Monty always opens the door and makes the "switch or stick" offer, is the one typically analyzed in statistics classes.)

The question now is, does it make any difference to the contestant's chances of winning to switch, or might they just as well stick with the door they have already chosen?

When they first meet this problem, many people think that it makes no difference if they switch. They reason like this: "There are two unopened doors. The prize is behind one of them. The probability that it is behind the one I picked is 1/3, the probability that it is behind the one I didn't is the same, that is, it is also 1/3, so it makes no difference if I switch."

A common variant is for people to think that the two probabilities are not 1/3 and 1/3, but 1/2 and 1/2. Again, the intuition is that they are faced with two equally likely outcomes, but instead of regarding them as two equal choices that remain from an initial range of three options, they view the choice facing them as a completely new situation.

Surprising though it may seem at first,
however, either variant of this reasoning is wrong. Switching actually *doubles* the contestant's chance of winning. The odds go up from the original
1/3 for the chosen door, to 2/3 that the other unopened door hides the prize.

There are several ways to explain what is going on here. Here is what I think is the simplest account.

Imagine you are the contestant. Suppose the doors are labeled A, B, and C. Let's assume you (the contestant) initially pick door A. The probability that the prize is behind door A is 1/3. That means that the probability it is behind one of the other two doors (B or C) is 2/3. Monty now opens one of the doors B and C to reveal that there is no prize there. Let's suppose he opens door C. (Notice that he can always do this because he knows where the prize is located.) You (the contestant) now have two relevant pieces of information:

[1] The probability that the prize is behind door B or C (i.e., not behind door A) is 2/3.

[2] The prize is not behind door C.

Combining these two pieces of information, you conclude that the probability that the prize is behind door B is 2/3.

Hence you would be wise to switch from the original choice of door A (probability of winning 1/3) to door B (probability 2/3).
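If the argument still feels slippery, a quick simulation settles the matter empirically. Here is a minimal sketch (the function name and trial count are my own choices, not from the column) that plays the classic game many times under each strategy:

```python
import random

def play(switch, trials=100_000):
    """Simulate the classic Monty Hall game; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)       # door hiding the prize
        choice = random.randrange(3)      # contestant's initial pick
        # Monty opens a door that is neither the pick nor the prize
        opened = random.choice(
            [d for d in range(3) if d != choice and d != prize])
        if switch:
            # move to the one remaining unopened door
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials
```

Running `play(True)` gives roughly 0.667 and `play(False)` roughly 0.333, matching the 2/3 versus 1/3 analysis.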

Now, if you have never seen this problem before, or have still not managed to "see the light", there is really little point in you reading on. (Besides, you probably can't resist spending your time instead emailing me to tell me my reasoning is fundamentally flawed.) But if you have, perhaps with great effort, come to convince yourself that the above reasoning is correct, then prepare yourself for another shock.

Consider a slightly modified version of
the Monty Hall game. In this variant,
after you (the contestant) have chosen
your door (door A, say), Monty asks
*another contestant* to open one of the other two doors. That contestant, who like you has no idea where the prize is, opens one at random, let us say, door C, and you both see that there is no prize there. As in the original game, Monty now asks you if you want to switch or stick with your original choice. What is your best strategy?

If you adopt the reasoning I gave earlier for the original Monty game, you will arrive at the same conclusion as before, namely that you should switch from door A to door B, and that exactly as before, if you do so you will double your likelihood of winning. Why? Well, you will reason, you modified the probability of the prize being behind door B from 1/3 to 2/3 because you acquired the new information that there was definitely no prize behind door C. It does not matter, you will say, whether door C was opened (to reveal no
prize) by deliberate choice or randomly.
Either way, you get the same crucial
piece of information: *that the prize
is not behind door C.* The original
argument remains valid. Doesn't it?

Well, no, as a matter of fact it doesn't. In the original Monty problem, Monty knows from the start where the prize is, and he uses that knowledge in order to always open a door that does not hide a prize. Moreover, you, the contestant, know that Monty plays this way. This is crucial to your reasoning, although you probably never realized that fact.

Here, briefly, is the argument for the variant game:

You choose one door, say, door A. The probability that the prize is there is 1/3.

The probability that the prize is behind one of door B and door C is 2/3.

The other contestant has a choice between door B and door C. The odds she faces are equal. Assume she picks door C. The probability that she wins is 1/2 x 2/3 = 1/3.

The probability that she loses is likewise 1/2 x 2/3 = 1/3. And that's the probability that you win if you switch. Exactly the same as if you did not.
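The variant can be simulated too, and the sketch below (again, names and counts are mine) makes the key modeling point explicit: we must discard the runs where the second contestant reveals the prize, because the situation being analyzed is conditioned on both of you seeing an empty door.

```python
import random

def variant(switch, trials=200_000):
    """Simulate the variant where a second contestant opens a door at random.

    Only trials in which no prize is revealed are counted, since those are
    the only trials matching the situation you actually face."""
    wins = valid = 0
    for _ in range(trials):
        prize = random.randrange(3)
        choice = random.randrange(3)
        # the second contestant opens one of the other two doors at random
        opened = random.choice([d for d in range(3) if d != choice])
        if opened == prize:
            continue                      # prize revealed: not our situation
        valid += 1
        if switch:
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / valid
```

Both `variant(True)` and `variant(False)` come out near 0.5: switching no longer helps.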

Confused? As sometimes arises in mathematics, when you find yourself in a confusing situation, it may be easier to find the relevant mathematical formula and simply plug in the appropriate values without worrying what it all means.

In this case, the formula you need is due to an 18th-century English Presbyterian minister by the name of Thomas Bayes. Bayes' formula languished largely ignored and unused for over two centuries before statisticians, lawyers, medical researchers, software developers, and others started to use it in earnest during the 1990s.

Bayes' formula shows you how to calculate the probability that a certain proposition S is true, based on information about S, when you know:

(1) the probability of S in the absence of any information;

(2) the information about S;

(3) the probability that the information would arise regardless of whether S is true or not;

(4) the probability that the information would arise if S were true.

Let P(S) be the numerical probability that the proposition S is true in the absence of any information. P(S) is known as the *prior probability.*

You obtain some information E.

Let P(S|E) be the probability that S is
true given the information E. This is the revised estimate you want to calculate.
It is called the *posterior
probability.*

A quantity such as P(S|E) is known as a
*conditional probability* - the
conditional probability of S being true, given the information E.

Let P(E) be the probability that the information E would arise if S were not known to be true and let P(E|S) be the probability that E would arise if S were true.

The ratio P(E|S)/P(E) is called the
*likelihood ratio* for E given S.

Bayes' theorem says that the posterior probability P(S|E) is derived from the prior probability P(S) by multiplying the latter by the likelihood ratio for E given S:

P(S|E) = P(S) x P(E|S) / P(E)

Notice how the formula reduces the problem of computing how probable S is, given the information, to computing how probable it would be that the information arises if S were true.
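Expressed as code, the formula is a one-liner. This sketch (the function name `posterior` is mine) is checked against the Monty Hall numbers computed later in the column:

```python
def posterior(prior, likelihood, p_evidence):
    """Bayes' formula: P(S|E) = P(S) * P(E|S) / P(E)."""
    return prior * likelihood / p_evidence

# Monty Hall, original game: P(B) = 1/3, P(E|B) = 1, P(E) = 1/2
print(posterior(1/3, 1, 1/2))   # -> 0.666...
```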

When you apply Bayes' formula to the Monty Hall problem, you begin with an initial value for the probability attached to a proposition that the prize is behind the unchosen door B, say, namely 1/3. This is the prior probability. Then you modify that probability assessment based on the new information you receive (in this case, the opening of door C to reveal that there is no prize behind it) to give a revised, or posterior probability for that proposition, which works out to be 2/3.

Here is the computation in full detail.

You select door A, and Monty opens door C to reveal that there is no prize there. So you now know that p(C) = 0. What are your new estimates for p(A) and p(B)?

We will apply Bayes' formula. Let E be the information that there is no prize behind door C, which you get when Monty opens that door. Then:

p(A|E) = p(A) x p(E|A) / p(E)

p(B|E) = p(B) x p(E|B) / p(E)

We need to calculate the various probabilities on the right of these two formulas.

p(A) = p(B) = 1/3.

p(E|A) = 1/2, since if the prize is behind A, Monty may pick either of B, C to reveal that there is no prize there.

p(E|B) = 1, since if the prize is behind B, Monty has no choice: if he wants to open a door without a prize, he must open C.

p(E|C) = 0, since if the prize is behind C, Monty cannot open it.

Since A, B, C are mutually exclusive and exhaust all possibilities:

p(E) = p(A).p(E|A) + p(B).p(E|B) + p(C).p(E|C)

= (1/3).(1/2) + (1/3).(1) + (1/3).0

= 1/2

Hence, applying Bayes' formula:

p(A|E) = p(A) x p(E|A) / p(E) = (1/3) x (1/2) / (1/2) = 1/3

p(B|E) = p(B) x p(E|B) / p(E) = (1/3) x (1) / (1/2) = 2/3

Thus, based on what you know after Monty opened door C to show that there was no prize there, you estimate the chance of winning if you stick is 1/3 and if you switch is 2/3. So you should switch.
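The same arithmetic can be carried out exactly with Python's `fractions` module; the snippet below is just the calculation above transcribed, nothing more:

```python
from fractions import Fraction

p_A = p_B = p_C = Fraction(1, 3)   # priors: prize equally likely anywhere
pE_A = Fraction(1, 2)              # prize behind A: Monty picks B or C at random
pE_B = Fraction(1)                 # prize behind B: Monty must open C
pE_C = Fraction(0)                 # prize behind C: Monty cannot open it

# total probability of the evidence E ("Monty opens C, revealing no prize")
pE = p_A * pE_A + p_B * pE_B + p_C * pE_C   # = 1/2

stick  = p_A * pE_A / pE   # = 1/3
switch = p_B * pE_B / pE   # = 2/3
print(stick, switch)       # -> 1/3 2/3
```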

Now let's consider the variant game where another contestant opens door C and you both see that there is no prize there. This time, the various probabilities are:

p(A) = p(B) = 1/3.

p(E|A) = p(E|B) = 1/2, since whether the prize is behind A or behind B, the other contestant is equally likely to pick B or C, and if she picks C it reveals no prize.

p(E|C) = 0, since if the prize is behind C and the other contestant opens it, you would see the prize.

Thus:

p(E) = p(A).p(E|A) + p(B).p(E|B) + p(C).p(E|C)

= (1/3).(1/2) + (1/3).(1/2) + (1/3).0

= 1/3

Hence, applying Bayes' formula:

p(A|E) = p(A) x p(E|A) / p(E) = (1/3) x (1/2) / (1/3) = 1/2

p(B|E) = p(B) x p(E|B) / p(E) = (1/3) x (1/2) / (1/3) = 1/2

Thus, it makes no difference whether you stick or switch.
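The corresponding exact calculation for the variant game, again a direct transcription of the figures above:

```python
from fractions import Fraction

p_A = p_B = p_C = Fraction(1, 3)   # priors: prize equally likely anywhere
pE_A = pE_B = Fraction(1, 2)       # random opener picks C (not B) half the time
pE_C = Fraction(0)                 # she opened C and no prize appeared

pE = p_A * pE_A + p_B * pE_B + p_C * pE_C   # = 1/3

stick  = p_A * pE_A / pE   # = 1/2
switch = p_B * pE_B / pE   # = 1/2
print(stick, switch)       # -> 1/2 1/2
```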

So there you have it.

Whether you believe it is another matter.

Devlin's Angle is updated at the beginning of each month.