It is in no way rational to be a two-boxer.
Isn’t this a question of hubris? The title claims that “Rational People Do Worse”, but wouldn’t a rational person surmise that there is some information/mechanism they don’t have access to that the computer does in order to achieve the success rate claimed? In that case the rational choice would be to claim the single box.
The other factor not addressed is the $1k vs. $1M. The calculus changes a lot based on these amounts. If it were 10 bucks vs. the chance to win millions? That’s the lottery. If it were $10k, it’d be a lot harder to walk away from that.
As a thought experiment, it’s a lot easier to say what you think the answer should be; actually making that choice when there is real money on the line is psychologically different.
Mystery box all the way!
This was a pretty good video, but the supercomputer setup made me kind of meh on it compared to most of the channel’s videos.
It isn’t really a “problem” in my mind because no such computer can or does exist (one that can predict your decision with 99% accuracy). And they hand wave away what the computer might be doing or collecting to make that decision prior to you even knowing what the problem is going to be. I don’t think you can just hand wave that away.
So that “hypothetical supercomputer” is more like an “impossible supercomputer,” which ruins the thought-experiment part of this for me. It’s like saying an all-knowing sky fairy/god/Buddha has made a prediction about your decision: what do you decide?
Well, I’d say I don’t believe in the sky fairy/god/Buddha and would need evidence for the 99% success rate before proceeding.
I guess refusing to engage with the hypothetical is a choice. Personally I think hypotheticals are most interesting and revealing specifically when they are about impossible situations.
Like the question: if you could have any superpower, what would it be?
I would choose the ability to see the future with 99% accuracy just to mess with people by running this box experiment.
I agree with you, but in context.
Meaning, ask me for my superpower and what I’d do with it, sure!
But this channel makes really good science-focused content. So to present this video, which essentially requires an all knowing god-like entity, then try and break down the game theory and probabilities, just seems odd and out of their lane a bit.
The money is the less interesting bit; the belief in God and the outcome is the more interesting one.
If you believe the god bit and play the game:
Take box A&B, get $1.001M: you fooled god, there is free will. Or you are the lucky 1% god can’t predict.
Take box A&B, get $1000: you did not fool god, there is no free will.
Take box B, get $1M: you did not fool god, there is no free will.
Take box B, get $0: you fooled god, there is free will. Or you are the lucky 1% god can’t predict.
If you don’t believe the god bit, then this is just some sort of con man swindle and you are only getting a max $1000 anyway.
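The four believer outcomes above can be sketched as a simple payoff function, assuming box A always holds $1,000 and box B holds $1,000,000 only when the predictor foresaw one-boxing (function and label names are mine, just for illustration):

```python
def payout(choice, prediction):
    """choice/prediction: 'one' (box B only) or 'two' (boxes A & B)."""
    box_b = 1_000_000 if prediction == "one" else 0
    box_a = 1_000
    return box_b + (box_a if choice == "two" else 0)

# Enumerate the four cases from the comment above
for choice in ("two", "one"):
    for prediction in ("one", "two"):
        verdict = "fooled god (or lucky 1%)" if choice != prediction else "did not fool god"
        print(f"take {choice}, predicted {prediction}: ${payout(choice, prediction):,} ({verdict})")
```

The table makes the religious framing explicit: the payout alone never tells you which world you are in; only the mismatch between choice and prediction does.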
So I sorta view this whole thing as religious philosophy, which is why it feels weird on a science channel.
“It’s not trying to trick you”
Yes it is. The whole setup is predicated on being tricked. It’s psychological manipulation from the start, with this talk of how accurately it predicts people’s choices. The setup is meant to intimidate you into agreeing with the computer because it’s so accurate that no rational person would defy it!
Intellectually dishonest/lazy shit like this is why I don’t watch Veritasium videos anymore.