A Million Dollar Puzzle: The Newcomb Paradox

I’ve put together a new interactive activity at Philosophy Experiments. It’s here:

A Million Dollar Puzzle

If you follow me on Twitter, then you’ll probably already have played through it. Thing is, I find it genuinely baffling, so I’m not sure I have much to say about it, other than: see what you think.

You’ll see at the end that people simply don’t agree about the best answer – there is nothing like a crowd-sourced consensus here.

The activity is a version of Newcomb’s Paradox, which you can read about here. Apparently, Robert Nozick once said of the puzzle, "To almost everyone, it is perfectly clear and obvious what should be done. The difficulty is that these people seem to divide almost evenly on the problem, with large numbers thinking that the opposing half is just being silly."

For what it’s worth, I’m inclined towards the slightly more popular response (see the final analysis page).
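
To see why the two answers pull against each other, here’s a rough back-of-the-envelope sketch. It assumes the classic Newcomb figures rather than the activity’s exact terms: $1,000 in Box A, $1,000,000 or nothing in Box B, and a clairvoyant who is right with probability p. The amounts and the little helper function are illustrative assumptions, not anything taken from the activity itself.

```python
# Illustrative figures only (classic Newcomb amounts, not necessarily the activity's).
SMALL = 1_000        # always sitting in Box A
BIG = 1_000_000      # in Box B only if "Box B only" was predicted

def expected_value(strategy, p):
    """Expected payoff against a clairvoyant who is right with probability p.
    strategy is 'one-box' (take Box B only) or 'two-box' (take both boxes)."""
    if strategy == "one-box":
        # Prediction correct (probability p): Box B is full. Wrong: it's empty.
        return p * BIG
    # Prediction correct (probability p): Box B is empty, you get only SMALL.
    # Prediction wrong (probability 1 - p): Box B is full, you get SMALL + BIG.
    return p * SMALL + (1 - p) * (SMALL + BIG)

for p in (0.5, 0.6, 0.9, 0.99):
    print(p, expected_value("one-box", p), expected_value("two-box", p))
```

On those figures, one-boxing has the higher expected value at any accuracy above roughly 50.05%. The two-boxer’s reply is the dominance point that comes up in the comments below: once the boxes are loaded, taking both pays exactly $1,000 more than taking Box B alone, whichever state Box B is in.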

  1. The problem I have, which I am sure others have pointed out, is that if the clairvoyant is 100% accurate, then the only possibilities are:

    Box A&B predicted / Box A&B chosen, or
    Box B predicted / Box B chosen.

    In the other cases, the choice does not match the prediction, so the clairvoyant can’t be 100% accurate.

    I assume I am missing something obvious.

  2. Hi Keith

    I’m not sure if you’re missing something obvious! I think some versions of the paradox hold that the clairvoyant is approaching 100% accuracy, which avoids the logical difficulty you’re flagging up. I have it as “for all intents and purposes” 100% accurate, which isn’t quite the same thing, but nearly. (I’ll sketch how the numbers play out at different accuracies in a rough simulation after the comments below.)

    But I’d need to read the literature to have a view about whether or not you’re missing something. I’m inclined towards the Box B only choice, precisely because of the point you’re making.

    But the alternative argument – that when you come to make your choice everything is already in place, so you can’t possibly be worse off by taking an extra box (since if you take Box A and Box B, you still get whatever is in Box B, which is all you’d have got by taking Box B alone, plus the contents of Box A) – also seems… well, right.

    I don’t know! :smile:

  3. Kallan Greybe

    If you ask me, what we’ve got here is a classic example of conflicting intuitions.

    1. Intuitively, causation runs forwards in time, but in this case we have a future event (my decision) apparently causing a past one (the clairvoyant’s prediction). That’s classic time-travel-paradox fare, so it shouldn’t be a wonder that our intuitive causal sense isn’t going to work that well to begin with.

    2. The clairvoyant is exercising a judgement, and our normal understanding of human judgement is statistical. When someone says something is certain to happen, even in the best cases of human judgement we normally understand them as meaning that it’s just really unlikely that it won’t happen. What makes this tricky is the introduction of genuine certainty, which turns the clairvoyant into a purely causal mechanism. So what you’ve actually got are two very different kinds of explanation, causal and probabilistic, tied together in the one problem.

  4. Dianelos Georgoudis

    Assume that the clairvoyant is 100% accurate. Then, clearly, by picking Box B alone one will get more money than by picking both boxes. But it is impossible that by picking one box one gets more money than by picking both boxes. Thus we get a contradiction, which proves that the assumption is wrong.

    So we have found out that there cannot, even in principle, be a 100% accurate clairvoyant. But it is only if free will exists that there can’t, in principle, be a 100% accurate clairvoyant. Therefore free will exists.

  5. I see it a different way.

    You will never get the million bucks because whatever you pick the clairvoyant will ensure that you don’t. The game is rigged.

    If we remove the spurious clairvoyance aspect and rephrase it this way, it becomes very clear.

    “You go into a competition where there are two boxes, one of which always contains $1,000, and the other of which may contain either nothing or a million. Before the boxes are loaded, you tell the person loading them what choice you will make; he then tries to minimize his losses.”

    It is perfectly clear that no one in their right mind would ever load Box B with a million unless they knew you weren’t going to pick it.

    Likewise, there is no reason you wouldn’t always pick Box A.

    If you pick Box B as well, you will always find it empty.

    The mental confusion comes in trying to accommodate the idea of perfect free will in the same conceptual space as perfect clairvoyance.

    If your perfect free will trumps his perfect clairvoyance, occasionally you will get the million.

    On the other hand, if he thinks you may, he will simply play safe and never load Box B at all.

  6. I pick the living cat, as long as it isn’t dead.

  7. Hi,
    this was fun! I took home only Box B, but my reasoning was a little more on the pragmatic side.
    It was stated that: “The amount of money in Box B will be determined by Perfect Predictions’s most accurate clairvoyant – for all intents and purposes, 100% accurate”.
    Well, if the prediction is, for all intents and purposes, 100% accurate, then, if I take the bloody Box B home and it is empty, I can sue Perfect Predictions and get them to pay me the million dollars. So even if the clairvoyant isn’t 100% accurate, it’s their loss, not mine.
    If I instead took home the two boxes, then, if I only get the $10,000 in Box A, I can’t complain.
    ;)
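
As promised in my reply to Keith above, here’s a quick and purely illustrative simulation of what “approaching 100% accurate” does to the numbers. It again assumes the classic $1,000 / $1,000,000 amounts rather than the activity’s own figures, and a clairvoyant who guesses your strategy correctly with probability p.

```python
import random

# Illustrative simulation only: classic Newcomb amounts and a clairvoyant
# who guesses the player's strategy correctly with probability p.
SMALL, BIG = 1_000, 1_000_000

def play(strategy, p):
    """One round: the prediction is right with probability p, and Box B is
    loaded with the million only if 'one-box' was predicted."""
    if random.random() < p:
        predicted = strategy
    else:
        predicted = "two-box" if strategy == "one-box" else "one-box"
    box_b = BIG if predicted == "one-box" else 0
    return box_b if strategy == "one-box" else SMALL + box_b

def average(strategy, p, rounds=100_000):
    """Average winnings over many repeated rounds."""
    return sum(play(strategy, p) for _ in range(rounds)) / rounds

for p in (0.5, 0.75, 0.99):
    print(f"p={p}: one-box ~{average('one-box', p):,.0f}, "
          f"two-box ~{average('two-box', p):,.0f}")
```

Over many imagined repetitions the habitual one-boxer pulls ahead at anything much above coin-flip accuracy. The two-boxer’s point, of course, is that you only play once, and by the time you choose the boxes are already loaded.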
