
There's a concept for this called utility. It has nothing to do with how many times you get to play, but rather with how much use you have for the money.

If the $10000 means little to you, you might even take a negative expected value for a chance at a big score (people do this every day with $1 lottery tickets), regardless of how many times you get to play.

Edit: The reason that repeating the game has some allure is that you are changing it from a low probability of winning one large payout to a higher probability of winning something, with the full distribution of outcomes given by the binomial distribution. It also changes the utility equation.
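
To make that concrete, here's a quick sketch using the thread's running numbers -- a $10,000 stake and a 1% shot at $1,000,000, which are my guesses at the game being discussed, not anything stated precisely -- showing how the number of wins over repeated plays follows a binomial distribution:

    # Distribution of outcomes over n repeated plays of a
    # hypothetical game: risk $10,000 for a 1% chance at $1,000,000.
    from math import comb

    stake, prize, p, n = 10_000, 1_000_000, 0.01, 100

    for k in range(4):
        # P(exactly k wins in n plays), straight from the binomial formula
        prob = comb(n, k) * p**k * (1 - p)**(n - k)
        net = k * prize - n * stake
        print(f"{k} wins: P = {prob:.3f}, net outcome {net:+,} dollars")

With these numbers you still walk away down $1,000,000 about 37% of the time, which is exactly why the repeated game feels different without actually being safe.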




Exactly. This is the economic principle of "marginal utility" -- specifically, diminishing marginal utility. See http://en.wikipedia.org/wiki/Marginal_utility

The principle is that I'm willing to buy a lottery ticket, even though the odds are slanted (the agency takes a large cut) because the $1 the ticket costs me is meaningless to my life, but the $1 million I might win would fundamentally change my life.

This is why people buy insurance: the $1000 I spend every year on homeowner's insurance hurts only slightly, but the amount that I'd lose to a fire would destroy me financially.

It's also the rationale for burglary: the thief is very likely to make a large sum of money, while he only has a small chance of a devastating loss (i.e., going to jail).

The question is all wrapped up in the relative size and likelihood of the events, where the size is judged relative to the difference that it makes to you.


If "repeating the game ... changes the utility equation" from negative to positive, how can you possibly say "it has nothing to do with how many times you get to play"?


It'd be a mistake to say it has nothing to do with that, but the number of times you get to play is subsumed into the utility equation. It would be irrelevant if utility were linear.

The point is that $10,000 to me is worth more than 1% of the value to me of $1,000,000. Whereas I would probably pay $1.10 for a 1% chance of winning $100, though not on a regular basis.
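
A back-of-the-envelope version of that, assuming log utility and a $50,000 starting wealth (both purely illustrative choices on my part):

    # Why a sure $10,000 can beat a 1% shot at $1,000,000 in utility
    # terms, even though the expected dollar amounts are equal.
    from math import log

    wealth = 50_000

    def u(w):          # log utility: each extra dollar matters less
        return log(w)

    sure_gain = u(wealth + 10_000) - u(wealth)            # ~0.182
    gamble = 0.01 * (u(wealth + 1_000_000) - u(wealth))   # ~0.030

    print(sure_gain / gamble)  # the sure $10,000 is worth ~6x the gamble

Under these assumptions the certain $10,000 delivers about six times the expected utility of the 1% lottery, despite identical expected dollar value.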


"negative to positive" - those are your words.

If utility dictates that I should not risk $10,000 on a 99% chance of losing it all, it is unlikely that I have the bankroll to play the game as many times as would be required to overcome my risk of losing everything. And by playing multiple times, I have introduced a scenario with significant probability of losing much more than $10,000, which is obviously catastrophic if losing $10,000 means so much to me.
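
Here's a rough Monte Carlo sketch of that bankroll problem. The numbers are made up -- a $50,000 bankroll, a $10,000 stake, and a 1% shot at $1,500,000, so the game has clearly positive expected value -- but the point survives:

    # Even a positive-EV game busts a modest bankroll most of the time,
    # because ruin usually arrives before the law of large numbers does.
    import random

    def ruined(bankroll=50_000, stake=10_000, prize=1_500_000,
               p=0.01, plays=100):
        for _ in range(plays):
            if bankroll < stake:
                return True          # can't cover the next stake: busted
            bankroll -= stake
            if random.random() < p:
                bankroll += prize
        return False

    trials = 100_000
    print(sum(ruined() for _ in range(trials)) / trials)  # rough P(ruin)

With these numbers you go broke roughly 95% of the time: you can only afford five losses before your first win, and a win in the first five plays is unlikely.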

The utility is certainly different, but I wouldn't necessarily claim it is better. It could in fact be worse, but I haven't studied it enough to assert either.


Exactly. It's important to remember that, while in theory you might win a particular game on average, that game does not exist in a vacuum. Suppose you're playing a game in which the gambling is only one component, and the currency has some other use (say, it's also the score that determines who wins). The player in the lead should typically not bet all his money on any gamble, no matter how good the expected payoff.

edit: Of course, there are exceptions to that last rule of thumb. It's just a common case.
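
For what it's worth, one standard way to make "don't bet it all, even on a favorable gamble" precise is the Kelly criterion -- my framing, not the parent's. A sketch:

    # Kelly criterion: for a bet paying b-to-1 with win probability p,
    # the fraction of bankroll that maximizes long-run log wealth is
    # f* = (b*p - (1 - p)) / b, which stays well below 1 even for
    # very favorable gambles.
    def kelly_fraction(p, b):
        return (b * p - (1 - p)) / b

    # A coin paying even money (b=1) that wins 60% of the time:
    print(kelly_fraction(0.60, 1))    # 0.2 -> bet 20%, not everything
    # A long shot like the thread's: 1% chance at 150-to-1:
    print(kelly_fraction(0.01, 150))  # ~0.0034 -> about a third of 1%

Kelly assumes you can rebet indefinitely and that log wealth is what you care about, so it's a rule of thumb here, not a theorem about this game.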


While one can fold this type of difficulty into expected utility theory, that apparently doesn't cover all situations. People still weight losses more heavily than equivalent gains, overweight certainty, and overweight extremely low probabilities.

The classic paper by Daniel Kahneman and Amos Tversky presents a pretty thorough empirical critique of expected utility theory as a model of decision making. Cf. www.hss.caltech.edu/~camerer/Ec101/ProspectTheory.pdf
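
To illustrate the low-probability overweighting concretely, here's the one-parameter probability weighting function from Tversky and Kahneman's 1992 follow-up (cumulative prospect theory), with their fitted parameter of roughly 0.61 for gains:

    # w(p) = p^g / (p^g + (1-p)^g)^(1/g): how people appear to weight
    # stated probabilities when choosing among gambles (g ~= 0.61 for
    # gains, per Tversky & Kahneman 1992).
    def w(p, g=0.61):
        return p**g / (p**g + (1 - p)**g) ** (1 / g)

    for p in (0.001, 0.01, 0.10, 0.50, 0.99):
        print(f"p = {p:>5}: weighted as {w(p):.3f}")

    # Small probabilities get inflated (w(0.01) is about 0.055, over
    # 5x too big), while near-certainties get deflated -- the
    # certainty effect.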



