## A new approach to the Monty Hall problem

Reams and reams have been written about the Monty Hall problem, but no-one seems to have mentioned a simple fact which, once realised, makes the whole thing seem intuitive.

The Monty Hall show is a (possibly fictional, I'm not sure) TV gameshow. One couple have beaten all the others to the final round with their incredible skill at answering questions on general knowledge and popular culture, and now have a chance to win a Brand New Car. There are three doors. The host explains that earlier, before the couple arrived, a producer on the show rolled a dice. If a 1 or a 4 was rolled, the car was placed behind the red door. If a 2 or a 5 was rolled, it was placed behind the blue door and if a 3 or a 6 was rolled, it was placed behind the yellow door.

The host invites the couple to pick which door they think the car is behind. He then opens one of the other two doors - and there's no car behind it! (He knows where the car is, so he can always arrange for this to happen.) Then the host asks the couple if they want to change their mind about which door they think the car is behind. Should they change? Does it make a difference?

Most people's first reaction is that it can't matter. How can it? The car has a one in three chance of being behind each of the doors.

No-one would argue that the car has anything but a probability of 1/3 of being behind the door the couple picked (say it's the red door). But when the host opens the blue door, magic happens. The probability of the car being behind the blue door suddenly drops to zero. That probability can't simply vanish (otherwise there would only be a 2/3 probability of there being a car at all), and it can't go to the red door, so this ghostly 1/3 probability-of-there-being-a-car goes to the yellow door. The car now has a 2/3 probability of being behind the yellow door. "Poppycock!" most people would say. Probability isn't this "magic stuff" that can travel between doors. But the correct answer is that the couple should change doors - the car really does have a 2/3 probability of being behind the yellow door.

If you're in doubt, you could simulate the situation with a computer program, run it lots of times for the choices "never change doors" and "always change doors", and see what fraction of the time the couple wins the car in each case. You will find that changing wins 2/3 of the time, and sticking 1/3. Or you could enumerate the possibilities:

1/3: Couple picks correct door in the first place. If they change, they lose.
2/3: Couple picks the wrong door. The other wrong door is then eliminated, so if they change, they win.

So changing has a 2/3 probability of winning. This reasoning sounds like a more plausible argument for changing doors.
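The simulation mentioned above is only a few lines. Here is one possible sketch (the function and variable names are my own, not from the original):

```python
import random

def play(switch, trials=100_000):
    """Play the game `trials` times; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # couple's initial pick
        # Host opens a door that is neither the pick nor the car.
        opened = random.choice([d for d in range(3)
                                if d != pick and d != car])
        if switch:
            # Switch to the one remaining unopened, unpicked door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stick: ", play(switch=False))   # ~1/3
print("switch:", play(switch=True))    # ~2/3
```

Running it confirms the enumeration: sticking wins about a third of the time, switching about two thirds.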

The key to this matter, and what makes the whole thing confusing to those who don't realise this, is that probability depends on what you know. If you think about this for a while, it becomes obvious. A fair coin, when tossed, has a 50% probability of landing on heads. However, once the event has happened, the probability collapses to 0% (if it landed tails up) or 100% (if it landed heads up). Let event A be the tossing of a coin at noon, and let success be defined by the coin landing heads up. At five seconds to noon the probability of success is 0.5. At five seconds past noon, when everybody can see that the coin landed heads up, the probability of success is 1.0. If the coin is tossed and it rolls under the sofa, then at five seconds past there is still a 50% chance of success. Although the coin has landed, no-one knows what the result is. Probability depends on what you know. If you know nothing about the coin, the probability of success is 0.5.

Suppose a neutral third party is the only one to see the coin, and says "I'm not going to tell you what it says, but I'm going to roll a dice (behind your back, so you can't see it). If it comes up even, I'll say "heads", whatever the coin says. If it comes up odd, I'll say what the coin says. But I won't tell you whether the dice came up odd or even." Suppose this third party then says "heads". He might have said it because he rolled an even number, or because the coin really did land heads up (or both). What is the probability of success now? We can enumerate the possibilities again and notice that of the four equally likely possibilities (Heads+even, Heads+odd, Tails+even, Tails+odd) the only one we've eliminated is Tails+odd, since in that case he would have said "tails". Of the remaining three possibilities (which are still equally likely), two of them involve success, so the probability of success is 2/3. We can check this as follows: he says "heads" three out of four times, so the overall probability of success is (2/3)*(3/4) + 0*(1/4) = 1/2 (since there is 0 probability of success if he says "tails"). This is the answer we expected: before he says anything, the probability of success is 1/2.
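This scenario is also easy to check by simulation. A minimal sketch (names are mine, not from the original): toss a coin and roll a dice many times, apply the third party's announcement rule, and look only at the trials where he said "heads".

```python
import random

N = 200_000
said_heads = 0      # trials where the third party announces "heads"
coin_was_heads = 0  # of those, trials where the coin really was heads

for _ in range(N):
    coin = random.choice(["heads", "tails"])
    die_even = random.choice([True, False])
    # Even roll: he says "heads" regardless; odd roll: he reports the coin.
    announced = "heads" if die_even else coin
    if announced == "heads":
        said_heads += 1
        coin_was_heads += (coin == "heads")

print("P(he says heads)          ~", said_heads / N)               # ~3/4
print("P(coin is heads | heads)  ~", coin_was_heads / said_heads)  # ~2/3
```

The two printed fractions come out near 3/4 and 2/3, matching the enumeration above.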

We conclude that, by cleverness, we can do a "partial collapse" of the probability by finding out a bit of information (if not all of it). In this case the knowledge that the neutral third party said "heads" doesn't give as much information about the state of the coin as seeing the coin itself - it doesn't tell us for definite whether we have heads or not, but it does impart enough information to change the probability.

This is exactly what happens in the Monty Hall problem. The host imparts some information to the couple about which door the car is behind, but not enough for them to tell for definite which door it is - just enough to shift the probability in favour of the door they would choose if they opted to "change". If it were a complete probability collapse (i.e. if the host opened both of the doors the car isn't behind) no-one would be in any doubt as to whether they should change or not. It's because the probability has only partially collapsed that people get confused.

Justin sent me this email:

I read your paper on "Monte Hall Strikes Back." and absorbed that probability depends on what you know. Following is a question that I made up and am having trouble "partially collapsing". Maybe you can help me out with an insight:

There are two doors, door #1 and door #2 behind which two real numbers are written at random. You get a prize if you choose the door with larger real number. At this point, the probability of winning a prize is 1/2. However, you get a chance. You first choose a door, and Monty shows you the number behind that door. What should you do in order to do better than 1/2? Or is it even possible to do better than 1/2?

The answer to the question hinges on how these two real numbers are chosen. If all real numbers are equally likely, you can never do better than 1/2 because for any real number x the size of the set of real numbers smaller than x is exactly the same size as the set of real numbers larger than x (this is easy to prove, just pair them up: for all y>0, pair up x-y with x+y).

Of course, the game can't work like that in the real world because most of the real numbers are extremely large (either positive or negative) and require more atoms than there are in the universe to write down.

Suppose we have a more reasonable probability distribution for the numbers, $P(r \le x \le r+dr) = f(r)\,dr$. Then, given the value behind the first door, $x$, you can calculate the probability of $y$ (the number behind the second door) being less than $x$, $\displaystyle P(y < x) = \int_{-\infty}^x f(r)\,dr = F(x)$, and switch doors if $P(y < x) < \frac{1}{2}$. Using this strategy you can calculate your probability of winning before knowing $x$, and without even knowing the distribution!

$P(win; F(x)>\frac{1}{2}) = F(x)$
$P(win; F(x)<\frac{1}{2}) = 1-F(x)$

$\displaystyle P(win) = \int_{-\infty}^zf(x)(1-F(x))dx+\int_z^\infty f(x)F(x)dx$ where $F(z) = \frac{1}{2}$. Integrating by parts gives $P(win) = \frac{3}{4}$.
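A quick way to sanity-check the distribution-independent 3/4 result is to simulate it for one concrete distribution. Taking $f$ uniform on $[0,1]$, we have $F(x) = x$ and $z = \frac{1}{2}$, so the strategy is simply "switch if $x < \frac{1}{2}$". This sketch (names are mine) estimates the win probability:

```python
import random

def win_fraction(trials=200_000):
    """Numbers uniform on [0,1], so F(x) = x; switch iff F(x) < 1/2."""
    wins = 0
    for _ in range(trials):
        x = random.random()  # number behind the chosen door (revealed)
        y = random.random()  # number behind the other door (hidden)
        predict_other_is_larger = x < 0.5   # the switching rule
        wins += (predict_other_is_larger == (y > x))
    return wins / trials

print(win_fraction())   # ~0.75
```

The estimate lands near 0.75, as the integral predicts; the same holds for any continuous distribution, provided you know its $F$.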

However, the probability of winning using the optimal strategy after you know x depends on the distribution and on the value of x, but can be anywhere between 1/2 and 1.

John de Pillis, a Professor of Mathematics at University of California in Riverside, emailed me to let me know about a graphical "proof" that switching doors (or cups in this case) improves your chances of success. If you're still confused, his diagram might help.

This diagram appears in John's book "777 Mathematical Conversation Starters", published by the Mathematical Association of America.

### 11 Responses to “A new approach to the Monty Hall problem”

1. mikey says:

Nice explanation.
Just FYI, Monty Hall is a real person and hosted the game show 'Let's Make a Deal', on which this problem is based.

2. The main error in the current discussions of the Monty Hall Paradox/ Problem is that everyone is considering the problem as a “one step problem” whereas it is actually a “two step problem”.
The first step is the irrelevant part of the problem and is the part in which you choose one out of three doors and then the show host opens the door that has the goat (out of the two doors that you did not choose).
The second step is the part in which you choose to change doors or not (or, in other words, the step in which you choose one of two doors).

Given that THE FIRST STEP IS ALWAYS THE SAME, i.e., you pick a door and the host opens, out of the two unchosen doors, the one that has the goat. Therefore THE FIRST STEP DOES NOT ADD ANYTHING UNKNOWN, it is thus irrelevant for the statistical calculations (it has a 1.0 probability).
The first step shall then be disregarded, for it is just a technique to create the illusion of a more complicated game.
The problem can be then focused on the second step, which always begins in the same manner: one of the three doors (the one that was opened by the host and which contains the goat) is out of the equation and you have to choose one out of two doors.
That is it, there is no need for complex calculations, the actual game is quite simple: given those TWO doors, choose one! (the "change your first choice" thing is just to create a feeling of mystery/risk that does not actually exist).
It is then a very simple: 50/50 chance of winning the car or not.

If you want to make things more complicated and actually discuss if the person is “changing” the doors, it remains the same, for you do NOT consider 3 choices, you only consider 2 choices (the third door is always opened and, thus, removed from the discussion).
Then let’s say you initially chose (out of the 2 doors) the door with the car behind and you change, you lose, but if you do not change you win (50/50 chances). The same applies if you chose the door with the goat.

• Andrew says:

The first step isn't always the same, since there are two possibilities for what's behind the door that you chose (car or goat). They just look the same when you don't have all the information. The chance of winning isn't 50/50, as you can verify experimentally - changing doors does give you an advantage. Do you still think it's 50/50 if there are 100 doors and the host opens 98 "goat" doors?

• Mitch says:

Yes. Because regardless of what door you choose originally, there are two remaining doors that could possibly contain the car. The first step doesn't matter. Flip a coin to see if you stay or switch.

• Andrew says:

The two remaining doors don't have the same probability of containing the car though.

• Elvis says:

Brilliantly wrong. The first "step" is extremely consequential in that it will have only a 1/3 chance of being correct--and there is a 2/3 chance that the car is behind one of the doors you didn't choose. Look at it this way: you can have the choice of staying with Door 1 (33.33% of being correct), OR you can have BOTH Doors 2 and 3 (66.66% chance that the car is behind Door 2 or 3), if you switch. When Monty reveals a goat in one of the doors you didn't pick, it doesn't change the odds of 2/3 that the car is behind either of Doors 2 or 3. If he revealed Door 3 was a goat, then there is 2/3 chance that the car is behind Door 2. 50/50 has nothing to do with it.

• Oldnsenile says:

Another way to show a fallacy in the official solution, is to assume 2 players instead of 1. If player 1 selected door #1 and player 2 selected door #2, then they would have equal probabilities of winning. Also, it should be obvious that they could not both increase their probability of winning by trading doors. I don't know why a computer simulation would support an advantage of switching, unless the formulated model was in error.

• Andrew says:

But the host opens door 2 and it's empty, so at that point we know that player 2 has a 0% chance of winning. It doesn't follow that player 1 also has a 0% chance of winning - the probabilities change (and are no longer equal for the two players) when the host opens a door!

• PalmerEldritch says:

You can't play the MHP with 2 contestants. There are thousands of simulation programs, are you saying they're all in error?

3. Probability calculation by using the coins.

After the host throws open the empty door (let’s say C), which he always knows beforehand as empty (due to the placing of the prize behind a certain door and the player choosing a door for himself), the choice involving two remaining doors (A or B) for the prize and by the player in terms of two sides of coin can be considered as the head H (corresponding to door A) and the tail T (corresponding to door B). These choices are mutually exclusive in terms of H and T, implying that the player or the prize can’t be both at door A and door B.

Consider also that the prize and the player are represented now by two coins, coin 1 and coin 2, respectively.

We can then represent all the possible choices involving prize, player and the two remaining doors A and B (after the host has opened the empty door with the prior knowledge about the door being empty) in terms of coins 1 and 2 and their heads and tails.

From the above, prize at door A will be designated as 1H (head in the case of coin 1) and the player at door A as 2H. Similarly, prize at door B will be 1T and the player at door B as 2T.

The conditions for switch to work (leading to a win), i.e. either the prize is at door A (1H) and the player at door B (2T), or the prize at door B (1T) and the player at door A (2H): which in terms of probability, the probability for the switch to work,
P(s) = P(1H)*P(2T) + P(1T)*P(2H) ………….. (Eq. 1)

Similarly, the conditions for the switch to not work (i.e. failing to win), either the prize is at door A (1H) and the player at door A (2H), or the prize at door B (1T) and the player at door B (2T), which in terms of probability for switch not producing a win,
P(f) = P(1H)*P(2H) + P(1T)*P(2T) ……………(Eq. 2).

Since the probability for head or tail for any coin is 1/2, i.e. P(1H) = P(1T) = P(2H) = P(2T) = 1/2.

The probability therefore for the switch leading to a prize (from Eq. 1), P(s) = 1/2;

And the probability switch leading to no win (using Eq. 2), P(f) = 1/2

In other words, there is no advantage in the player making a switch

• Andrew says:

Your analysis takes its conclusion (that P=1/2) as a premise, and is therefore a tautology. Just because there are two doors left and a coin has two sides doesn't mean that the probabilities of those two things work out the same, as I hoped my original post had clarified.