## Game Theorist — “Dumb is the new Smart”

An interesting article points out that even game theoreticians don’t believe their own results (when money is on the line).

> a research team repeated the experiment using professional game theorists playing for real money. But even among game theorists, game theory failed

One hypothesis is that you can get good results by playing dumb. If your opponent knows you are totally rational, then they have to give up a lot to keep from getting screwed. (This particular example deals with the Traveler’s Dilemma, but it applies to the Prisoner’s Dilemma, as well).

I remember a book that dealt with various puzzle aspects of Game Theory as told by Sherlock Holmes, et al. One passage discussed the Prisoner's Dilemma: a clever fellow tries to use it on real prisoners, and when it doesn't work, he goes to Holmes, who sighs and calls forth one of the prisoners.

“So you know that it’s always better [to defect].”

“Yes, guv’ner.”

“Then pray explain to [this doofus] why you don’t.”

“Me mates would beat me senseless.”

Glad to see the theoreticians catching up.

Now to just figure out how this relates to unconvincing Cylons, and the applications will be endless!

If I played the Traveler's Dilemma, I would absolutely say either $100 or $99, and would consider it stupid to do otherwise.

It's a probabilistic risk problem to me.

If I say $2, I am guaranteed to get either $2 or $4.

If I say $99 or $100, then I will get between $99 and $101 if the other player does the same, or I will get $(X - 2) if they say some lower number $X.

Even if I assume there is a 90% chance they will be "rational" and say $2, and a 10% chance of them saying something in the $90-$100 range, my expected value is still higher if I say $99 or $100.

If you believe that most people will follow a probabilistic model instead of playing the Nash equilibrium answer, then only saying $99 or $100 makes sense.

Alexfrog, December 11, 2008 at 3:14 pm
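The expected-value argument above is easy to sanity-check. Here's a quick Python sketch; the 90/10 opponent model is the one assumed in the comment, and the payoff rule is the standard Traveler's Dilemma (claims from $2 to $100, with a $2 bonus for the lower claim and a $2 penalty for the higher one):

```python
# Expected value of claiming $2 vs $99 under the assumed opponent model:
# 90% chance the opponent claims $2, 10% chance of a claim in $90-$100.

def payoff(mine, theirs):
    """Lower claim wins and gets a $2 bonus; the higher claim gets the
    lower claim minus a $2 penalty; equal claims pay face value."""
    if mine < theirs:
        return mine + 2
    if mine > theirs:
        return theirs - 2
    return mine

def expected_value(my_claim):
    high_claims = range(90, 101)
    ev_high = sum(payoff(my_claim, b) for b in high_claims) / len(high_claims)
    return 0.9 * payoff(my_claim, 2) + 0.1 * ev_high

print(f"EV of claiming  $2: {expected_value(2):.2f}")
print(f"EV of claiming $99: {expected_value(99):.2f}")
```

Even with a 90% chance of a "rational" $2 opponent, claiming $99 roughly quadruples the expectation (about $9.35 versus $2.20), which is exactly the comment's point.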

But wait – given the problem as stated, surely the “rational” choice is for both participants to bid $100. If both bid $100, each one gets $100. It doesn’t make rational sense to bid down to $2 in order to score the extra $2… The only way that would be rational even with game theory is if you set the “winning” condition to be “I made at least as much if not more than you”.

SeanP, December 12, 2008 at 12:42 pm

The idea is that if the other player bids $100, then you can bid $99 to increase your winnings to $101. But if he knows that you are bidding $99, he can bid $98 (to win $100), and so forth, iterated down to $2. One problem is that I doubt anyone in real life iterates more than two or three times. Also, as Alex says, you need to look at the big picture. There is only ONE situation in which defecting gains you anything, which is if you manage to bid exactly $1 less than the other guy. Thus defecting will rarely be worth the risk. It’s a case of missing the wood for the trees (which seems to be a common malady of game theory actors).

Kester Jarvis, December 12, 2008 at 5:22 pm
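The bid-down spiral described above can be simulated directly: start at $100 and repeatedly replace the claim with the best response to it. A sketch, assuming the standard $2-$100 payoffs with a $2 bonus/penalty:

```python
# Simulating the bid-down spiral: each step replaces a claim with the
# best response to it, under the standard Traveler's Dilemma payoffs.

def payoff(mine, theirs):
    if mine < theirs:
        return mine + 2
    if mine > theirs:
        return theirs - 2
    return mine

def best_response(theirs):
    # The claim that maximizes my payoff against a fixed opposing claim.
    return max(range(2, 101), key=lambda mine: payoff(mine, theirs))

claim, trajectory = 100, [100]
while best_response(claim) != claim:
    claim = best_response(claim)
    trajectory.append(claim)

print(trajectory)  # counts down one dollar at a time: 100, 99, 98, ..., 2
```

It takes 98 rounds of undercutting to shave the pot from $100 down to the $2 fixed point, which supports the observation that nobody in real life iterates more than two or three times.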

Probabilistically, I think that $99 is the optimal answer to this question, if you approach it as a risk-management problem instead of trying to determine the Nash Equilibrium.

Alexfrog, December 12, 2008 at 6:43 pm

Yes, $99 dominates $100, since it always does better or at least as well, no matter what the other fellow does (if he bids $100, I gain a buck; if he bids $99, I gain $2; and if he bids anything else, it's a wash). So an answer of $99 is justifiable. However, $98 doesn't dominate $99, unless you assume that your opponent will never bid $100 because it is dominated. But that leads you on the spiral toward the clearly suboptimal answer of $2. They even have a label for that kind of reasoning: leaky induction (cool name, huh?). So while it's not the only answer, I think $99 is the best and probably the most likely value for the game.

Nash Equilibrium can get you into trouble. The ($99, $99) solution for the game isn't stable, in that one of the players can benefit by choosing $98. The only stable solution is ($2, $2). But in non-zero-sum games, unlike zero-sum ones, stable doesn't mean optimal! It could just be a local maximum, where if both players stray from the solution, they can both do better. Some people (and maybe even some Game Theoreticians) cling to the belief that Nash Equilibrium = Game Solution, but then again, some people still believe in Santa Claus and the Eternal Bull Market. I just can't believe that the majority of academics in Game Theory would say that the value of this game is $2.

Larry Levy, December 12, 2008 at 7:43 pm
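Both claims in this last comment, that $99 weakly dominates $100 (while $98 does not dominate $99) and that ($2, $2) is the only stable pair, can be checked by brute force. A sketch, again assuming the standard $2-$100 Traveler's Dilemma payoffs with a $2 bonus/penalty:

```python
# Brute-force verification of the dominance and stability claims.

CLAIMS = range(2, 101)

def payoff(mine, theirs):
    """Lower claim wins and gets a $2 bonus; the higher claim gets the
    lower claim minus a $2 penalty; equal claims pay face value."""
    if mine < theirs:
        return mine + 2
    if mine > theirs:
        return theirs - 2
    return mine

def weakly_dominates(a, b):
    """Claim a never does worse than claim b against any opposing claim,
    and does strictly better against at least one."""
    diffs = [payoff(a, o) - payoff(b, o) for o in CLAIMS]
    return all(d >= 0 for d in diffs) and any(d > 0 for d in diffs)

def is_stable(a, b):
    """Nash condition: neither player gains by unilaterally deviating."""
    return (all(payoff(a, b) >= payoff(x, b) for x in CLAIMS) and
            all(payoff(b, a) >= payoff(y, a) for y in CLAIMS))

print(weakly_dominates(99, 100))  # True
print(weakly_dominates(98, 99))   # False: $98 loses when the other bids $100
print([(a, b) for a in CLAIMS for b in CLAIMS if is_stable(a, b)])  # [(2, 2)]
```

The exhaustive search confirms that ($2, $2) is the unique stable pair, even though nearly every other pair of claims pays both players far more.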