Newcomb's problem
- Steve3007
- Posts: 10339
- Joined: June 15th, 2011, 5:53 pm
Re: Newcomb's problem
- amplified cactus
- Posts: 26
- Joined: December 29th, 2019, 6:00 pm
Re: Newcomb's problem
amplified cactus wrote:
Why two-box?

Steve3007 wrote: ↑December 30th, 2019, 7:16 pm
This is a new condition that wasn't stated in the original wording of the problem. With this new condition the predicting power of the predictor is irrelevant to the problem, as far as I can see, because the only relevant action that she could take, based on her predictive knowledge, is now unavailable to her.

It's not a new condition. I was simply presenting how a two-boxer would probably describe the situation. Once you're in the room, the boxes are set. That's the sense in which your decision can't make any difference. Once you're asked to make a decision, the money is either there or it isn't; at that point, you have no causal influence over the contents of box B.
Once you enter the room, the content of B is fixed. The Predictor makes her prediction and sets the box before you arrive. Your decision can't make any difference to what's in the boxes.
- Felix
- Posts: 3117
- Joined: February 9th, 2009, 5:45 am
Re: Newcomb's problem
Steve3007 wrote:
This is a new condition that wasn't stated in the original wording of the problem. With this new condition the predicting power of the predictor is irrelevant to the problem, as far as I can see, because the only relevant action that she could take, based on her predictive knowledge, is now unavailable to her.

Yes, if you know that the Predictor's prediction of your decision is based on her psychological/behavioral analysis of you, and she cannot change her action (depositing the money) after you enter the room, then doing the opposite of what you are inclined to do would practically guarantee you'd win the game, e.g., if you would be inclined to pick both boxes, then pick box B instead.
amplified cactus wrote:
It's not a new condition. I was simply presenting how a two-boxer would probably describe the situation. Once you're in the room, the boxes are set. That's the sense in which your decision can't make any difference.

You've contradicted yourself. You said before that if the Predictor believes I will choose both boxes, she won't put £1,000,000 in Box B.
- amplified cactus
- Posts: 26
- Joined: December 29th, 2019, 6:00 pm
Re: Newcomb's problem
Felix wrote: ↑December 30th, 2019, 9:15 pm
You've contradicted yourself. You said before that if the Predictor believes I will choose both boxes, she won't put £1,000,000 in Box B.

Yes, if the Predictor predicts that you pick both boxes, then she will leave box B empty. Where's the contradiction? The point is that once you're actually in the room, the prediction has already been made. Nothing you do at that point will causally influence what's in the boxes.
- h_k_s
- Posts: 1243
- Joined: November 25th, 2018, 12:09 pm
- Favorite Philosopher: Aristotle
- Location: Rocky Mountains
Re: Newcomb's problem
h_k_s wrote: ↑December 30th, 2019, 4:41 pm
A fundamental principle of dealing with risk is always to minimize all risk of any kind. This is a personal philosophy dealing with self preservation however.

amplified cactus wrote: ↑December 30th, 2019, 6:21 pm
"Always minimize all risk of any kind" seems ridiculous to me, and I don't see how anybody could actually live in accordance with such a principle. Do you avoid getting in cars due to the risk that you might crash? Do you avoid using stairs due to the risk that you might trip and fall? This sounds pathological.

To avoid unnecessary risks, you must avoid getting into cars when you do not absolutely need to.
Therefore if taking both boxes minimizes the risk that you will go away empty handed, that is clearly the best approach philosophically speaking.
The only truly philosophical principle involved is risk avoidance.
Game theory on the other hand teaches you how to play games, which involves taking risks.
But philosophically, you should never take any unnecessary risk, and you should always eliminate all risks that can be eliminated whenever practical to do so.
There is a big difference between game theory and philosophy.
Philosophy focuses on what is best.
Game theory focuses on winning at the risk of losing. Most people first learn about game theory in high school math class.
Philosophy is normally not taught until college or else in an expensive prep school. In Catholic high schools you are taught about Augustine and Aquinas and the other Romantic philosophers and their philosophies. These all deal with God and God-ness.
In any case, assuming your goal is to gain wealth, two-boxing does involve a risk: the risk that you will lose out on getting the £1,000,000. Minimizing the risk of going away empty handed will significantly increase the risk of going away without the £1,000,000.
I have no idea what you're talking about with comments like "The only truly philosophical principle involved is risk avoidance" and "philosophically, you should never take any unnecessary risk". You seem to be using the term "philosophy" in a very unusual way.
Cars kill more people than does anything else on Earth except mosquitoes.
Once you fully understand this, then you are getting smarter, philosophically speaking.
- Steve3007
- Posts: 10339
- Joined: June 15th, 2011, 5:53 pm
Re: Newcomb's problem
amplified cactus wrote:
"Always minimize all risk of any kind" seems ridiculous to me, and I don't see how anybody could actually live in accordance with such a principle.

Yes, it is ridiculous. It is the "I will never get out of bed" principle. Nobody could or does live by it. If there were any one fundamental principle with regard to risk, it would be to minimize the risk/reward ratio. But, of course, there isn't. Different people have different attitudes to risk. For the purposes of this puzzle I'm assuming that my motivation is to maximize the probability of gaining the maximum amount of money.
- Steve3007
- Posts: 10339
- Joined: June 15th, 2011, 5:53 pm
Re: Newcomb's problem
amplified cactus wrote:
It's not a new condition. I was simply presenting how a two-boxer would probably describe the situation.

Ok. This part of the wording of the OP: "Your decision can't make any difference to what's in the boxes" seemed to be saying that the one relevant action the predictor can take, as a result of her knowledge about me, was forbidden to her. I presume what you meant to say was that my decision can and does make a difference to what's in the boxes (as stated in the original wording of the problem), but the predictor knows, before I arrive, what that decision will be.
amplified cactus wrote:
I was simply presenting how a two-boxer would probably describe the situation. Once you're in the room, the boxes are set. That's the sense in which your decision can't make any difference. Once you're asked to make a decision, the money is either there or it isn't; at that point, you have no causal influence over the contents of box B.

If, as stated in the original wording of the problem, the predictor "has the power to predict human behaviour with nearly perfect accuracy", then it doesn't matter whether she has already put (or not put) the £1,000,000 in box B before I arrive. She could have either done or not done that before I was born. She could have done it 1,000,000 years ago. It would make no difference. She knows, even before I am born, that I will take box B. Therefore she puts £1,000,000 in box B. Therefore I take only box B.
I presume the intention of the puzzle is to create what appears to be an absurd loop of causality once we propose the existence of an omniscient being who (we propose) can predict everything that any person is going to do at any point in the future, but we also keep the notion that our decisions are manufactured at the point when we decide them. It's similar to the puzzles that arise in considering travel backwards in time. And obviously it's similar to some puzzles which arise when proposing the existence of an omniscient God.
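The goal of maximizing the probability of gaining the maximum amount of money can be made concrete with a quick expected-value sketch. This is a minimal illustration, not anything from the thread itself: the 0.99 accuracy figure is a hypothetical stand-in for "nearly perfect accuracy", and `expected_winnings` is a made-up helper name.

```python
# Expected winnings for one-boxing vs two-boxing, assuming the
# Predictor is correct with probability p (here a hypothetical 0.99).

def expected_winnings(p):
    """Return (one_box_ev, two_box_ev) in pounds for predictor accuracy p."""
    # One-boxer: gets the £1,000,000 only if the Predictor correctly
    # foresaw one-boxing and so filled box B.
    one_box = p * 1_000_000
    # Two-boxer: always keeps the £1,000 in box A, and gets the
    # £1,000,000 only when the Predictor wrongly expected one-boxing.
    two_box = 1_000 + (1 - p) * 1_000_000
    return one_box, two_box

one, two = expected_winnings(0.99)
print(f"one-box EV: £{one:,.0f}")  # £990,000
print(f"two-box EV: £{two:,.0f}")  # £11,000
```

On these assumptions one-boxing dominates in expectation whenever the accuracy p exceeds roughly 0.5005, i.e. the Predictor barely needs to beat a coin flip.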
- Steve3007
- Posts: 10339
- Joined: June 15th, 2011, 5:53 pm
Re: Newcomb's problem
amplified cactus wrote:
The point is that once you're actually in the room, the prediction has already been made. Nothing you do at that point will causally influence what's in the boxes.

If we're talking in terms of causality and we don't believe that effects can precede causes, then we can put it like this: the predictor knows all causal chains in the Universe. She therefore knows all the causal chains that lead me to make the decision that I make, whatever that is. In that sense, she's a bit like the birdlike entity in the last book of The Hitchhiker's Guide to the Galaxy. Therefore my decision, when I'm in the room, does not causally affect what she does, but that decision is itself caused by various things that were happening in the past when she made her decision as to whether to deposit the million. So the act of her depositing the million and my decision have a common cause. In that sense, they are causally linked.
- Terrapin Station
- Posts: 6227
- Joined: August 23rd, 2016, 3:00 pm
- Favorite Philosopher: Bertrand Russell and WVO Quine
- Location: NYC Man
Re: Newcomb's problem
- Sculptor1
- Posts: 7091
- Joined: May 16th, 2019, 5:35 am
Re: Newcomb's problem
amplified cactus wrote: ↑December 30th, 2019, 6:27 pm
The only information you have when you're faced with the decision is that the Predictor is almost perfect. You know that billions of other people have been in the room before you, and that almost all of the one-boxers got the £1,000,000, while almost all the two-boxers did not. Nobody knows exactly how the Predictor works.

The hypothetical cannot work unless she chooses the same way each time.
It does matter, since knowing how she works would inform whether or not I thought she would predict one way or the other.
And since she simply could not know how I would choose, since she cannot know ME, her capacity to act in this respect is absurd.
Inevitably my decision has to be based on my knowledge of her method, and by extension her decision about how "humans", whatever the F*ck that is supposed to be, choose.

amplified cactus wrote:
Yes, it's absurd. Presumably it could never happen in reality. Again, it's a hypothetical. If you find it uninteresting or silly or whatever, you don't have to answer.

Since she has no knowledge of who is going to make the next choice, her decision cannot change, since she has no grounds for making a different choice.
Now you tell us that "almost all of the one-boxers got the £1,000,000, while almost all the two-boxers did not", which makes a complete mockery of the entire scenario.
If you want to offer a hypothetical, then please offer one that makes sense. This one is nonsense.
- amplified cactus
- Posts: 26
- Joined: December 29th, 2019, 6:00 pm
Re: Newcomb's problem
Steve3007 wrote: ↑December 31st, 2019, 4:48 am
Ok. This part of the wording of the OP: "Your decision can't make any difference to what's in the boxes" seemed to be saying that the one relevant action the predictor can take, as a result of her knowledge about me, was forbidden to her. I presume what you meant to say was that my decision can and does make a difference to what's in the boxes (as stated in the original wording of the problem), but the predictor knows, before I arrive, what that decision will be.
I meant to say exactly what I said. Again, I was simply presenting how most two-boxers think of the situation. Two-boxers tend to say that once you are in the room, nothing you do can make any difference to what's in the boxes. Why? Because, they would say, in order to make a difference to X, you need to be able to causally influence X. Once you are in the room, you have no causal influence over the contents of box B. Think of it this way: if you take the money from box A, what would happen to box B? Nothing, obviously. The money is either in there or it isn't. Nothing you do can change that now. So why not take the extra £1,000? That's how the two-boxer thinks about this problem.
Steve3007 wrote: ↑December 31st, 2019, 4:48 am
I presume the intention of the puzzle is to create what appears to be an absurd loop of causality once we propose the existence of an omniscient being who (we propose) can predict everything that any person is going to do at any point in the future, but we also keep the notion that our decisions are manufactured at the point when we decide them. It's similar to the puzzles that arise in considering travel backwards in time. And obviously it's similar to some puzzles which arise when proposing the existence of an omniscient God.

The puzzle is intended simply as a thought experiment to probe the underlying principles of (rational) decision making. In particular, it gets a lot of discussion in the debate between evidential decision theory and causal decision theory.
- amplified cactus
- Posts: 26
- Joined: December 29th, 2019, 6:00 pm
Re: Newcomb's problem
Sculptor1 wrote: ↑December 31st, 2019, 12:34 pm
Since she has no knowledge of who is going to make the next choice, her decision cannot change since she has no grounds for making a different choice.

Where did you get that from? She does know who is going to make the next choice. She makes predictions for each particular individual who enters the room. So she has a list of predictions such as "Frank will one-box", "Vincent will one-box", "Bob will two-box", etc., and she sets the boxes accordingly. Presumably, she is near-omniscient and uses her information about each person's psychological traits and decision-making processes to make her predictions about what their decisions will be. But this really isn't important. All that matters is that she has made a prediction about what you will choose, and she is near-perfect at predicting people's behaviour. It doesn't matter how her predictions are made.
- Felix
- Posts: 3117
- Joined: February 9th, 2009, 5:45 am
Re: Newcomb's problem
amplified cactus wrote:
Yes, if Predictor predicts that you pick both, then she will leave box B empty. Where's the contradiction? The point is that once you're actually in the room, the prediction has already been made. Nothing you do at that point will causally influence what's in the boxes.

As I said, after entering the room, so as to fool the predictor, I reversed my decision. At the last second, instead of choosing both boxes as I had intended, I chose Box B. But the predictor didn't believe I would do this and so left box B empty. Will she cheat me out of my 1 million pounds?
- amplified cactus
- Posts: 26
- Joined: December 29th, 2019, 6:00 pm
Re: Newcomb's problem
Felix wrote: ↑January 1st, 2020, 7:54 pm
As I said, after entering the room, so as to fool the predictor, I reversed my decision. At the last second, instead of choosing both boxes as I had intended, I chose Box B. But the predictor didn't believe I would do this and so left box B empty. Will she cheat me out of my 1 million pounds?

The Predictor is very likely to have predicted that you would reverse your decision. Whatever process you use to arrive at the decision to pick one box, the Predictor is very likely to have predicted it, and will therefore have put £1,000,000 in the box.
By stipulation, the Predictor is nearly perfect. You can't "cheat" her by changing your mind about things, because she will almost always correctly predict those changes of mind.
- Steve3007
- Posts: 10339
- Joined: June 15th, 2011, 5:53 pm
Re: Newcomb's problem
Felix wrote:
Yes, if you know that the Predictor's prediction of your decision is based on her psychological/behavioral analysis of you, and she cannot change her action (depositing the money) after you enter the room, then doing the opposite of what you are inclined to do would practically guarantee you'd win the game, e.g., if you would be inclined to pick both boxes, then pick box B instead.

As I understand it, if I do the opposite of what I was inclined to do, the Predictor already knew (with near certainty) that I was going to do the opposite of what I was inclined to do. Whatever I decide to do, for whatever reason, no matter how spontaneous I think I'm being, the Predictor knew before I entered the room what I was going to do. Even if I toss a coin and base my decision on that, the Predictor knew beforehand that I was going to do that and knew whether it would be heads or tails. Even if I base my decision on a truly random quantum event, she knows. I presume the Predictor is the kind of lady who knows whether a cat is alive or dead before the box is opened. What a gal!
amplified cactus wrote:
I meant to say exactly what I said. Again, I was simply presenting how most two-boxers think of the situation.

Ok.
amplified cactus wrote:
Two-boxers tend to say that once you are in the room, nothing you do can make any difference to what's in the boxes. Why? Because, they would say, in order to make a difference to X, you need to be able to causally influence X. Once you are in the room, you have no causal influence over the contents of box B.

Ok. What I would say to those two-boxers is what I said earlier. Even if the Predictor made the decision as to whether to put £1,000,000 into box B before they were born, she did so based on near-certain knowledge of what they would do once they'd been born, grown up and gone into that room. The decision taken in the room and the decision taken earlier by the Predictor have a common cause. That is the sense in which they are causally linked. The scenario presented in the OP dictates that the Predictor knows what I'm going to do before I decide to do it and can make decisions based on that knowledge. That may or may not be a realistic scenario (depending on whether one believes that there could be such a creature as an omniscient being capable of making decisions that affect the material world), but that is the scenario we're presented with and which we are asked to consider.
amplified cactus wrote:
Think of it this way: if you take the money from box A, what would happen to box B? Nothing, obviously. The money is either in there or it isn't. Nothing you do can change that now. So why not take the extra £1,000?

Because if I do that, the Predictor already knew I was going to do it, so she refrained (an arbitrarily long time before I entered the room) from putting £1,000,000 in box B. So I'll be left with only £1,000. Not even enough for a new car! Obviously this has odd consequences for my view of my own free will. It makes me think that my decision, even though I may believe myself to have manufactured it from no previous cause there and then, in that room, was, in some sense, not really mine. It was the pre-determined end-effect of a causal chain. It might make me apt to believe in notions like fate. But that's simply a consequence of contemplating the curious idea of the existence of omniscient beings who can affect the material world.
amplified cactus wrote:
That's how the two-boxer thinks about this problem.

If the two-boxer fully appreciates (and believes) what has been said about the Predictor, then that seems to me a curious attitude to take. Obviously, though, I would understand it if the two-boxer simply doesn't believe what he/she has been told about the Predictor. If this were the real world, rather than a thought experiment, I wouldn't either! Obviously we suspend our disbelief (as we do when engrossed in any good work of fiction) for the sake of analysing the problem as stated.
amplified cactus wrote:
The puzzle is intended simply as a thought experiment to probe the underlying principles of (rational) decision making. In particular, it gets a lot of discussion in the debate between evidential decision theory and causal decision theory.

Fair enough. I enjoyed the puzzle. But I think the god-like predictive powers of the Predictor more obviously address issues to do with omniscient gods and determinism. And, as I said, it also reminds me of the standard paradoxes which arise if we consider travel backwards in time to be possible. I'm reminded particularly of the film "Bill and Ted's Bogus Journey". There is a sense in which the backward time traveller, having already seen the future, has this Predictor-esque omniscience. A backward time traveller could simply observe what I do, then go back in time, taking that knowledge with them, and decide whether to put the money in the box.
As I understand it, the difference between evidential and causal decision theory is essentially the difference between correlation and cause. Evidential decision theory states that we simply look at which decisions have tended in the past to be correlated with which outcomes and base our decisions on that evidence. We only consider those correlations, and not whether they indicate a causal connection. In causal decision theory we look for actual causal connections.
I'm not sure if the puzzle sheds any light on this because there is a causal connection there even if I change my mind while in the room. My decision, taken in that room, is causally connected to the question of whether there is already £1,000,000 in box B. It has a common cause with the decision as to whether to place that money in box B some time previously.
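The contrast between the two theories can be sketched numerically. This is a toy illustration, not a formal statement of either theory: the 0.99 accuracy is a hypothetical stand-in for "nearly perfect", and `edt_value`/`cdt_value` are made-up helper names. EDT treats the choice as evidence about what was predicted; CDT treats the (already fixed) box contents as probabilistically independent of the choice made now.

```python
# Toy comparison of evidential vs causal decision theory on Newcomb's problem.

ACCURACY = 0.99  # assumed probability that the Predictor is right

def edt_value(action):
    """EDT conditions on the act: choosing is evidence about the prediction."""
    p_million = ACCURACY if action == "one-box" else 1 - ACCURACY
    base = 0 if action == "one-box" else 1_000  # box A's £1,000 if you two-box
    return base + p_million * 1_000_000

def cdt_value(action, q):
    """CDT holds the chance q that the money is already in box B fixed,
    independent of the act now being chosen."""
    base = 0 if action == "one-box" else 1_000
    return base + q * 1_000_000

print(f"EDT one-box: £{edt_value('one-box'):,.0f}")  # £990,000
print(f"EDT two-box: £{edt_value('two-box'):,.0f}")  # £11,000
# For ANY fixed q, CDT says two-boxing is worth exactly £1,000 more:
edge = cdt_value("two-box", 0.5) - cdt_value("one-box", 0.5)
print(f"CDT edge for two-boxing: £{edge:,.0f}")  # £1,000
```

So EDT recommends one-boxing (one-boxing correlates with the million being there), while CDT recommends two-boxing (whatever q is, two-boxing adds £1,000), which is exactly the correlation-versus-causation split described above.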