Imagine two players, named Predictor and Chooser, playing the following game. Chooser is presented with two boxes: an open box containing $1,000, and a closed box that contains either $1,000,000 or $0 (he doesn't know which). Chooser must decide whether to take the contents of both boxes, or just the contents of the closed box. The day before the choice, Predictor predicts whether Chooser will take 1 box or 2 boxes. If he predicts 1, he puts $1,000,000 in the closed box; otherwise he leaves that box empty. The question is: should Chooser take 1 box or 2 boxes?

A game-theoretic analysis is straightforward. If Chooser wants to maximize profit, and Predictor wants to maximize the accuracy of his predictions, then the Nash equilibrium is for Chooser to always take 2 boxes, and for Predictor to always predict that 2 boxes will be chosen. This gives a payout of $1,000 and a perfect prediction every time. If Predictor's goal is instead to minimize payouts, the equilibrium is the same. If two people played this game repeatedly, they would probably settle into this equilibrium fairly quickly.
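The payoff structure behind this equilibrium can be made explicit. Below is a minimal sketch in Python; the action labels are illustrative choices of mine, not part of the original game description.

```python
# Chooser's payoffs in dollars, indexed by (Chooser's action, Predictor's
# prediction). $1,000 sits in the open box; $1,000,000 goes into the
# closed box only when "1 box" is predicted.
payoff = {
    ("1 box", "predict 1"): 1_000_000,
    ("1 box", "predict 2"): 0,
    ("2 boxes", "predict 1"): 1_001_000,
    ("2 boxes", "predict 2"): 1_000,
}

def best_response(prediction):
    """Chooser's profit-maximizing action against a fixed prediction."""
    return max(["1 box", "2 boxes"], key=lambda a: payoff[(a, prediction)])

# Against either possible prediction, taking 2 boxes earns $1,000 more,
# so Predictor's accuracy-maximizing prediction is "predict 2", and
# ("2 boxes", "predict 2") is the equilibrium with a $1,000 payout.
assert best_response("predict 1") == "2 boxes"
assert best_response("predict 2") == "2 boxes"
assert payoff[("2 boxes", "predict 2")] == 1_000
```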

Now add an additional assumption: Predictor can see the future. Not just a possible future, but the true, actual future. In other words, Predictor is replaced with a time machine and a robot. Chooser presses one of two buttons marked 1 and 2. The time machine automatically sends this information back in time one day. If a 1 is sent back, then the robot puts $1,000,000 in the closed box. If a 2 is sent back, then the robot cleans out the box and leaves it empty. Now, how should Chooser play?

Once again, the mathematical analysis is simple. If Chooser takes 1 box, then it will contain $1,000,000. If Chooser takes 2 boxes, then the closed box will be empty, and the profit will be only $1,000. Clearly the best choice is 1 box.
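With a perfect predictor, the prediction always matches the choice, so only two of the four outcomes above remain reachable. A minimal sketch of the resulting payoffs, under that assumption (function and label names are mine):

```python
# With a time machine, the choice itself is sent back and causes the
# closed box's contents, so prediction == choice by construction.
def profit(choice):
    closed_box = 1_000_000 if choice == "1 box" else 0
    open_box = 1_000
    return closed_box + (open_box if choice == "2 boxes" else 0)

assert profit("1 box") == 1_000_000   # full closed box only
assert profit("2 boxes") == 1_000     # empty closed box plus the open box
```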

However, an argument can be made for choosing 2 boxes. By the time Chooser walks up to the boxes, the contents have already been set: the closed box is either empty or full, and it's too late for that to change. Chooser might as well take whatever's in both boxes. Whether the closed box is empty or full, he'll clearly make $1,000 more by taking both boxes than by taking just one. Causation only runs forward; events in the future can't cause results in the past, so there can't be any harm in choosing 2 boxes.
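The dominance argument above amounts to a one-line piece of arithmetic, sketched here:

```python
# Holding the (already fixed) contents of the closed box constant,
# taking both boxes always yields exactly $1,000 more than taking
# the closed box alone -- the two-boxer's dominance argument.
for closed_contents in (0, 1_000_000):
    one_box_profit = closed_contents
    two_box_profit = closed_contents + 1_000
    assert two_box_profit - one_box_profit == 1_000
```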

Philosophers have proposed many solutions to the paradox that avoid backward causation. Some have suggested that a rational person will choose 2 boxes and an irrational person will choose 1, so that irrational people do better at this game. Others have suggested that if such time machines can exist, then there is no free will, and Chooser will do whatever he's fated to do. Still others have suggested that the paradox itself shows that it's impossible to ever know the future.

Other people have suggested that in a world with time machines, causation *can* go backwards. If a person truly knows the future, and that knowledge affects his actions, then events in the future will be causing effects in the past. If causation can go backwards, then the paradox is straightforward. Chooser can freely pick either 1 or 2. That information will then go back in time and *cause* the closed box to have been empty or full. It's therefore better to choose 1 box rather than 2. If Chooser tries picking 2 instead, he will later discover that his choice caused that box to have been empty all along, and he'll receive less money. This resolves this form of the paradox. However, there is still a modified form of the paradox that is problematic.

Suppose now that the closed box is made of glass. What should Chooser do? If he sees $1,000,000 in the closed box, he might as well take both boxes, getting both the $1,000,000 and the $1,000. If he sees that the closed box is empty, he might be angry at being deprived of a chance at the big prize, and choose just the closed box to demonstrate that the game is a fraud. Either way, his action is the opposite of what was predicted, which contradicts the assumption that the prediction is always right.

This form of the paradox is equivalent to the grandfather paradox that arises in other forms of time travel. In the grandfather paradox, a person travels back in time, which leads to a chain of events preventing that from happening. In Newcomb's paradox with a glass box, the information about the choice travels back in time, which leads to a chain of events preventing that from happening. The various ways of resolving the two paradoxes are identical.

The paradox with the glass box could be taken as a proof that it is impossible to know the future. Or, perhaps knowledge of the future is only possible in cases where the knowledge itself won't prevent that future. Or, perhaps the universe will conspire to prevent self-contradictory causal loops (via the Novikov self-consistency principle, for example). Chooser might accidentally hit the wrong button, or he might misunderstand the rules, or the time machine might break. See Time travel and grandfather paradox for further discussion.

- Nozick, Robert (1969), "Newcomb's Problem and Two Principles of Choice," in *Essays in Honor of Carl G. Hempel*, ed. Nicholas Rescher, Synthese Library (Dordrecht, Holland: D. Reidel), p. 115.
- Gardner, Martin (1974), "Mathematical Games," *Scientific American*, March 1974, p. 102; reprinted with an addendum and annotated bibliography in his book *The Colossal Book of Mathematics* (ISBN 0-393-02023-1).
- Campbell, Richmond and Lanning Sowden, ed. (1985), *Paradoxes of Rationality and Cooperation: Prisoners' Dilemma and Newcomb's Problem*, Vancouver: University of British Columbia Press. (An anthology discussing this paradox, with an extensive bibliography.)
- Levi, Isaac (1982), "A Note on Newcombmania," *Journal of Philosophy* 79 (1982): 337-42. (A paper discussing the popularity of this paradox.)