Friday Fun Thread for September 6, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

Before you are three closed, identical boxes. The first box contains two silver coins and nothing else. The second box contains two gold coins and nothing else. The third box contains two coins (one gold and one silver) and nothing else. You have no idea which box is which, you cannot see inside the boxes, and they are mounted to the wall so you cannot lift them to estimate their weight.

You reach your hand into one of the boxes and withdraw a gold coin. If you reach into the same box to withdraw the second coin, what is the probability that that coin is also gold?

It's 2/3 (about 66%). You know that the box you withdrew the first coin from can't be the box with two silver coins in it, so it must be one of the other two boxes. There are exactly three coins remaining in these two boxes combined, one of which is silver and the other two gold. Ergo the odds of you withdrawing a second gold coin are 2/3.

This puzzle was shared on a Facebook meme page I follow. I'm not trying to flex or anything, but I solved it instantly and the solution seems incredibly obvious to me. I was very surprised to see the comments full of people asserting that the answer is 50%. There's even a Wikipedia article about it, and it's referred to as a "paradox".

One of my pet peeves is when the word "paradox" is used to refer to mathematical problems with counterintuitive solutions, or counterintuitive findings from the sciences - as opposed to contradictions in logic. Russell's paradox is a legitimate paradox: there is no good answer to the question "if a barber shaves all and only those men who do not shave themselves, does he shave himself?" The "twin paradox" in special relativity isn't actually a paradox, but I can see how it runs counter to our human intuition of how things work in Mediocristan. But in this case, I don't think this particular puzzle even rises to the level of "mathematical problem with a counterintuitive solution": the solution seems incredibly obvious and straightforward. The word "paradox" gets used far too freely.

Bayes to the rescue: If you pull a gold coin from a box, that is strong evidence that the box is pure gold (because with a gold box, you will always get that measurement), neutral evidence that it is mixed (because with a mixed box you /can/ get that measurement) and rules out that you have the pure silver box.

If you start with a uniform prior, then you should end up with 2/3 pure gold, 1/3 mixed, which will give the probability you said.
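
Spelled out as a quick calculation (a rough sketch, not anyone's posted code; the variable names are mine):

// Prior over boxes (GG, GS, SS) and the likelihood of drawing gold first from each.
const prior = [1/3, 1/3, 1/3];
const likelihood = [1, 1/2, 0];      // GG always yields gold, GS half the time, SS never

// Bayes: weight the prior by the likelihood, then renormalize.
const unnormalized = prior.map((p, i) => p * likelihood[i]);
const total = unnormalized.reduce((a, b) => a + b, 0);
const posterior = unnormalized.map(u => u / total);   // [2/3, 1/3, 0]

// Only the GG box still holds a gold coin after a gold was removed.
const pSecondGold = posterior[0];
console.log(posterior, pSecondGold);  // [0.666..., 0.333..., 0] 0.666...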

So what would you say is your probability of withdrawing a gold coin if everything else is the same, but the third box has one gold coin and 10 silver coins, instead of just one gold and one silver coin?

In that case, since we know he drew one gold coin, it also rules out Box 1, so we’re left with Box 2 and 3. We know that initially there were 13 coins here (3 gold, 10 silver) so now there are 12 coins left (2 gold, 10 silver). So 2 gold out of 12 remaining coins = 1 in 6 chance = 16.66%?

[Of course, the chances of him plucking a gold coin in the first place would have been much less than the 50% in the first question – it would actually have been just 20% (2 silver + 2 gold + 1 gold + 10 silver: 3 gold vs 12 silver = 20%). But since we know he did actually pick a gold coin first, the chances of him also picking a gold coin second are 16.66%.]

Of withdrawing another gold coin? 2/3

Why 2/3?

OP's intuition says that once you pick one gold coin, you know that you have one of two boxes, and that there are exactly 12 coins in those two boxes combined, two of which are gold, so that would put the probability of getting another gold coin at 2/12.

Wait, I am wrong.

The probability of picking the two-gold box and finding gold on the first draw is 1/3 * 1 = 1/3. The probability of picking the mixed box and finding gold on the first draw is 1/3 * 1/11 = 1/33.

11/33 vs 1/33: I am 11 times more likely to be holding the two-gold box than the mixed one, so the second coin is gold with probability 11/12, not 2/3.

Because you know that you picked gold initially. The odds of the second coin being gold are the odds that you didn't pick the one box in three with both gold and silver coins, meaning 2/3. The only way the second coin isn't gold is if the initial choice was the box with both silver and gold coins in it; the number of silver coins in that box doesn't matter because of the precondition of having picked a gold coin.

Of course they matter, they increase the chance that the gold picked in round one was from the double gold box dramatically, which itself hugely increases the odds of round 2 is also gold.

They don't matter, because the question is conditioned on the fact that we already picked a box with a gold coin in it.

The question is what the odds are that we picked the box with both gold and silver, given that we have a box with at least one gold coin in it. One box in three has gold and silver, hence the probability of the second coin being gold is 2/3. You could increase the number of silver coins to infinity and it wouldn't matter. You're picking boxes, not coins.

Yes. But the fact that we already picked a gold coin from the box tells us that it was almost certainly the double-gold box, and therefore that the probability of the second coin being gold is even higher.

No, it tells us nothing. The question is conditioned on a gold coin having been picked.

We didn't pick a box at random, the gameshow host did and revealed a gold coin.

I think this checks out but someone else will need to check for me.

It's probably good that you're not trying to flex too much about how smart you are due to finding the solution to this problem incredibly obvious, because it seems that you got the answer correct the same way that a broken clock gets the time correct twice every day. ;)

Okay, rude.

I disagree. Maybe this is the reason I "always forget" the simple route: because I'm not sure it's actually right. I did this two different ways, my renormalization route (thinking of things as a tree with info sets) and just brute-force reproducing the wiki entry on using Bayes to solve it.

Method 1: Renormalization

There's a 1/3 chance of picking each box, one of which has a 100% chance of giving you a gold on the first draw and the other a 1/11 chance (ignoring the option with zero chance of getting a first gold), so the chances of me being in each relevant box at the current state are 1/3 and 1/33. To renormalize, I need to multiply by the reciprocal of their sum, 1/3 + 1/33 = 12/33.

So my chance of being in the GG box is 11/12 and my chance of being in the G10S box is 1/12.

Method 2: Straight Bayes, yo

Just shutting up and calculating, reproducing the wiki article directly.

P(GG|see gold) = P(see gold|GG)*(1/3) / [P(see gold|GG)*(1/3) + 0 + P(see gold|G10S)*(1/3)]

P(GG|see gold) = (1/3) / (1/3 + 0 + 1/33)

= (1/3) / (12/33)

= 11/12

I'd say law of conditional probability is the simplest route here. P(2nd Coin Gold | 1st Coin Gold) = P(Both Gold) / P(1st Coin Gold) = 1/3 / (1/3 * 1 + 1/3 * 1/11) = 1/3 / (1/3 + 1/33) = 1/3 / (12/33) = 1/3 * 33/12 = 11/12.
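
For anyone who'd rather check that 11/12 by brute force (the simulations further down only cover the two-coin version), here's a rough sketch; the function name and layout are mine:

// Variant boxes: [2 gold], [2 silver], [1 gold + 10 silver]; 1 = gold, 0 = silver.
function simulateVariant(trials) {
  const boxes = [
    [1, 1],
    [0, 0],
    [1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
  ];
  let firstGold = 0, bothGold = 0;
  for (let i = 0; i < trials; i++) {
    const box = boxes[Math.floor(Math.random() * 3)].slice();                // copy a random box
    const first = box.splice(Math.floor(Math.random() * box.length), 1)[0];  // draw without replacement
    if (first === 1) {                                                       // condition on gold first
      firstGold++;
      if (box[Math.floor(Math.random() * box.length)] === 1) bothGold++;
    }
  }
  return bothGold / firstGold;   // should hover around 11/12 = 0.9166...
}
console.log(simulateVariant(100000));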

This is obviously correct, I have no idea what the other people are saying.

Two intuitive ways to formulate this come to mind.

First, law of conditional probability. P(A | B) * P(B) = P(A and B), or alternatively, P(A | B) = P(A and B) / P(B). P(2nd Coin Gold | 1st Coin Gold) = P(Both Coins Gold) / P(1st Coin Gold). P(1st Coin Gold) is 50% or 1/2, the probability of selecting the box with both gold coins (1/3) plus the probability of selecting the mixed coin box times one half (1/3 * 1/2 = 1/6). Probability both coins are gold is 1/3, as it can only occur if you pick the one box of three with two gold coins. 1/3 / (1/2) = 2/3.

Second, just thinking of all the possible outcomes before any coins are picked. Let's arbitrarily designate one coin in each box X, and the other Y. So boxes one to three contain coins Gold X and Gold Y, Gold X and Silver Y, and Silver X and Silver Y, respectively. There are only six possible outcomes for first and second pick, comma separated:

  1. Gold X, Gold Y
  2. Gold X, Silver Y
  3. Silver X, Silver Y
  4. Gold Y, Gold X
  5. Silver Y, Gold X
  6. Silver Y, Silver X

If you picked gold first, you know you can't be in outcomes 3, 5, or 6. Out of outcomes 1, 2, and 4, two of those have gold as the second pick. Thus, 2/3 once again.
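
The same enumeration, done mechanically (just the list above translated into a throwaway script):

// All six equally likely (first pick, second pick) orderings, numbered as above.
const outcomes = [
  ['Gold X', 'Gold Y'],      // 1
  ['Gold X', 'Silver Y'],    // 2
  ['Silver X', 'Silver Y'],  // 3
  ['Gold Y', 'Gold X'],      // 4
  ['Silver Y', 'Gold X'],    // 5
  ['Silver Y', 'Silver X'],  // 6
];
const goldFirst = outcomes.filter(([first]) => first.startsWith('Gold'));
const goldBoth = goldFirst.filter(([, second]) => second.startsWith('Gold'));
console.log(goldBoth.length + '/' + goldFirst.length);  // "2/3"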

I’m convinced this is easy to explain to pretty much anyone, just by making clear that the fact that you chose gold first makes the double-gold box more likely than the mixed box because coins of the same type are fungible.

Like the Monty Hall problem, people who make the mistake do so not (necessarily) because they're stupid, but because they just haven't been taught to think carefully about the "given that" aspect of these probability questions.

If your gold coin came from the double-gold box, you have a 100% chance of getting a second gold. If it came from the mixed box, you have a 0% chance of getting a second gold. (100+0)/2 does indeed equal 50. But the first gold was clearly twice as likely to come from the double-gold box as from the mixed box, which means that the chance of a second gold is obviously also higher.
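
Put differently, weight the two cases by how often each box produces that first gold instead of averaging them 50/50 (a quick sketch; numbers as above):

// Mistaken version: treat the two boxes as equally likely after seeing gold.
const naive = (1.0 + 0.0) / 2;                               // 0.5

// The double-gold box produces a first gold twice as often as the mixed box.
const pDoubleGold = (1/3 * 1) / (1/3 * 1 + 1/3 * 1/2);       // 2/3
const pMixed = 1 - pDoubleGold;                              // 1/3
const weighted = pDoubleGold * 1.0 + pMixed * 0.0;           // 2/3
console.log(naive, weighted);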

I got the right answer, but I always forget the simple way of thinking about it that you mentioned and do these problems the hard way.

There's a 1/3 chance of picking each box, one of which has a 100% chance of giving you a gold on the first draw and the other a 50% chance (ignoring the option with a zero percent chance of getting a first gold), so the chances of me being in each relevant box at the current state are 1/3 and 1/6. Renormalize, and you get a 2/3 chance of getting another gold. I think this renormalization reasoning works for these particular problems, but I'd probably have to sit with Bayes' rule for a minute to convince myself that it does generalize. I've been doing game-theoretic information sets on extensive form games more recently, so I'm picturing a tree in my mind and an information set across states.
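
That renormalization arithmetic, packaged as a small throwaway function so it handles both the original boxes and the ten-silver variant (the names are mine; it assumes every box holds at least two coins and is picked uniformly):

// P(second coin gold | first coin gold) for arbitrary boxes of 'g'/'s' coins.
function pSecondGoldGivenFirstGold(boxes) {
  let pFirstGold = 0, pBothGold = 0;
  for (const box of boxes) {
    const g = box.filter(c => c === 'g').length;
    const n = box.length;
    pFirstGold += (1 / boxes.length) * (g / n);
    pBothGold += (1 / boxes.length) * (g / n) * (g > 1 ? (g - 1) / (n - 1) : 0);
  }
  return pBothGold / pFirstGold;
}
console.log(pSecondGoldGivenFirstGold([['g','g'], ['g','s'], ['s','s']]));  // 2/3
console.log(pSecondGoldGivenFirstGold(
  [['g','g'], ['s','s'], ['g','s','s','s','s','s','s','s','s','s','s']]));  // 11/12 = 0.9166...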

I almost said 50%, but 2/3 is easy to prove by enumerating every possibility: uppercase coins are gold, lowercase coins are silver:

  • boxes contain AB, Cd, ef
  • there are six potential outcomes:
    • A, then B
    • B, then A
    • C, then d
    • d, then C
    • e, then f
    • f, then e
  • the last three are not what happened, so you're in one of the first three potential timelines, two of which result in a second gold coin

I prefer the boy or girl paradox, which is much less straightforward.

So, if you treat the coins as not being fungible, this makes sense. But they are fungible? So why wouldn't it be 50%? The question isn't about pulling a specific gold coin, but any gold coin. Like I'm pretty sure I could bang out a quick script that will run this 1000, or 10,000 times, or however many you want, and the observed results will be 50% and not 66%.

Edit: Huh, I'll be damned, it is coming out 66%

"Fungible" is misleading. There are 3 boxes, therefore each box has 1/3 of the possible outcomes. If you start with that, everything else falls into place.

function randint(n) {
  return Math.floor(Math.random() * n);
}

function draw_twice() {
  let boxes = [[0, 0], [0, 1], [1, 1]];  // 0 = silver, 1 = gold
  let box = boxes[randint(3)];           // pick a box at random
  let first_coin = randint(2);           // index of the coin drawn first
  let second_coin = 1 - first_coin;      // the coin left behind
  if (box[first_coin] == 1) {
    return box[second_coin];             // gold drawn first: report the other coin
  } else return -1;                      // silver drawn first: discard this trial
}

let cases = [0, 0, 0];
for (let i = 0; i < 10000; i++) {
  cases[draw_twice() + 1]++;             // -1 -> cases[0], gold->silver -> cases[1], gold->gold -> cases[2]
}
console.log('Silver picked first: ' + cases[0] + ' times, gold->silver: ' + cases[1] + ' times, gold->gold: ' + cases[2] + ' times');

Console output:

"Silver picked first: 4893 times, gold->silver: 1731 times, gold->gold: 3376 times"

Note that if I drew the coin with box.pop(), I'd get 50% because I'd only be drawing the gold coin from [0,1] every time.

Yeah, mine was a bit different.

class box
{
    public bool[] coins = new bool[2];   // true = gold, false = silver
}
class Program
{
    static void Main(string[] args)
    {
        box[] boxes = new box[3] { new box(), new box(), new box()};
        boxes[0].coins[0] = true;
        boxes[0].coins[1] = true;
        boxes[1].coins[0] = true;
        boxes[1].coins[1] = false;
        boxes[2].coins[0] = false;
        boxes[2].coins[1] = false;

        int discard_count = 0;
        int firstcoin_count = 0;
        int secondcoin_count = 0;

        Random rand = new Random();

        for(int i = 0; i < 100000; i++)
        {
            int boxnum = rand.Next(0, 3);    // pick one of the three boxes at random
            int coinnum = rand.Next(0, 2);   // pick one of its two coins at random

            bool firstcoin = boxes[boxnum].coins[coinnum];
            if (firstcoin)                   // first draw was gold
            {
                firstcoin_count++;
                bool secondcoin = boxes[boxnum].coins[(coinnum + 1) % 2];   // the other coin in the same box
                if (secondcoin)
                {
                    secondcoin_count++;
                }
            } else
            {
                discard_count++;
            }
        }
        Console.WriteLine(string.Format("discard_count = {0}", discard_count));
        Console.WriteLine(string.Format("firstcoin_count = {0}", firstcoin_count));
        Console.WriteLine(string.Format("secondcoin_count = {0}", secondcoin_count));
        Console.WriteLine(string.Format("chance of second coin given first coin = {0}", (double)secondcoin_count / (double)firstcoin_count));
    }
}

With output of

discard_count = 49935

firstcoin_count = 50065

secondcoin_count = 33440

chance of second coin given first coin = 0.6679316888045541

1/2, because given that I've withdrawn a gold coin, the only possibilities are that I'm drawing from either 1g1s or 2g, so after I've drawn for the first time, the box I'm drawing from is either 1s or 1g. Equal chance between those. If we work through this from the very beginning, I have a 1/3 chance to pick each of the three boxes, but the 1/3-probability case of drawing from 2s is discarded according to the premise.

I see where my mistake is after reading the explanation: I've assigned separate cases to each box rather than each coin.

I made the same mistake.

Your mistake is that upon drawing a gold coin, you concluded you’re equally likely to have drawn from 1g1s and 2g. The latter is twice as likely as the former. Interesting that more people would have understood this if the boxes held 100 silver, 99 silver + 1 gold, and 100 gold.

1/3? Unless I'm missing something, it's another variant of the Monty Hall thing.

Damn. I'll claim half a point for getting the opposite of the right answer.