
Monday, 2 September 2013

Not Prospect Theory

While reading the chapter on Prospect Theory in "Thinking, Fast and Slow", one of the things that struck me was that if you take the reference point for gains and losses to be the point you end up at, you get a pretty good model. For example, if you have a hundred and lose 20, your loss is 20/80; but if you win 20, your gain is 20/120.

The formula for this is x/(x+worth). Plots are done in Wolfram Alpha.
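The model can be sketched as a small Python helper (the function name `value` is mine, not from the original fiddle):

```python
# The model described above: a change x is measured against the
# worth you end up at, i.e. x / (x + worth).
def value(x, worth):
    """Perceived value of a gain or loss x for someone holding `worth`."""
    return x / (x + worth)

# Winning 20 on a worth of 100 is felt as 20/120;
# losing 20 is felt as -20/80.
print(value(20, 100))   # 20/120, about 0.167
print(value(-20, 100))  # -20/80 = -0.25
```

Note the asymmetry falls out automatically: the same 20-unit swing hurts more as a loss than it pleases as a gain, because the loss is measured against a smaller end point.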


Not sure if this is important yet but here is the derivative:
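For the record, the derivative of x/(x+worth) with respect to x works out to worth/(x+worth)^2, which a quick finite-difference check confirms:

```python
# Sanity-check the derivative of x/(x+worth):
# d/dx [x/(x+worth)] = worth/(x+worth)^2.
def value(x, worth):
    return x / (x + worth)

def derivative(x, worth):
    return worth / (x + worth) ** 2

h = 1e-6
for x in (-50, 0, 50):
    numeric = (value(x + h, 100) - value(x - h, 100)) / (2 * h)
    print(x, derivative(x, 100), numeric)  # analytic and numeric agree
```

The derivative is always positive (more is always better) and falls off as the square of the end point, so sensitivity shrinks as you get richer.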


As opposed to Prospect Theory which looks like this:


One thing I didn't like about Prospect Theory is the steep curve for a loss: it does not explain why people are so ready to buy insurance. In the above model, if you could lose 90% of your value then -90/(-90+100) gives -9, and if you have to pay 1% then -1/(-1+100) gives -1/99, so you would pay the premium for any chance of loss greater than 1 in 891. Whereas in Prospect Theory the steep loss curve should mean people are against paying for insurance.
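The insurance arithmetic above can be run through the same model (again using my own `value` helper for x/(x+worth)):

```python
# Insurance under the x/(x+worth) model, with worth = 100.
def value(x, worth):
    return x / (x + worth)

worth = 100
pain_of_loss = value(-90, worth)    # -90/10 = -9
cost_of_premium = value(-1, worth)  # -1/99

# Indifference point: p * |pain_of_loss| = |cost_of_premium|,
# so you insure against any loss probability above 1 in 891.
break_even_odds = pain_of_loss / cost_of_premium
print(break_even_odds)  # 891, up to float rounding
```

The huge asymmetry comes from the denominator: a 90% loss is measured against the 10 units you would have left, while the 1% premium is measured against the 99 you keep.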

Someone on Hacker News submitted a link to the Wikipedia article on the St. Petersburg Paradox, so I set up a Python fiddle to mess about with this: http://pythonfiddle.com/st-petersburgh-paradox

The game as explained on Wikipedia is:
A casino offers a game of chance for a single player in which a fair coin is tossed at each stage. The pot starts at 1 dollar and is doubled every time a head appears. The first time a tail appears, the game ends and the player wins whatever is in the pot. Thus the player wins 1 dollar if a tail appears on the first toss, 2 dollars if a head appears on the first toss and a tail on the second, 4 dollars if a head appears on the first two tosses and a tail on the third, 8 dollars if a head appears on the first three tosses and a tail on the fourth, and so on. In short, the player wins 2^(k−1) dollars if the coin is tossed k times until the first tail appears.
The paradox is that no rational person would pay a substantial amount to play, yet the game has an infinite expected payout. How does my model resolve this?
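The game itself is easy to simulate; here is a minimal sketch (not the original fiddle's code) that shows why the raw expected payout is so ill-behaved: the sample mean keeps creeping up as you play more games.

```python
# Monte Carlo simulation of the St. Petersburg game.
import random

def play(rng):
    """One game: pot starts at 1 and doubles on each head; a tail ends it."""
    pot = 1
    while rng.random() < 0.5:  # heads
        pot *= 2
    return pot

rng = random.Random(42)
for n in (100, 10_000, 1_000_000):
    mean = sum(play(rng) for _ in range(n)) / n
    print(n, mean)  # the sample mean drifts upward with n
```

Every payout is a power of two, and the rare long runs of heads dominate the average, which is why the empirical mean never settles down.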


One of the interesting things I found was that there is a power law for how much you would be willing to pay to play the game. There are two ways to run the figures.

One is to never update your worth, which produces (0, 3.24) (1, 4.31) (2, 6.57) (3, 9.56) (4, 12.80) (5, 16.11) (6, 19.42) (7, 22.75), where (1, 4.31) means an original worth of 10^1 and an expected value of 4.31.



The second is to update your worth after each winning bet: (0, 1.54) (1, 3.18) (2, 5.55) (3, 8.55) (4, 11.80) (5, 15.11) (6, 18.43) (7, 21.75). It doesn't make much difference.



In both instances a 10x increase in worth increases the expected value by about 3 to 3.3. So it would be 'correct' for someone with more money to pay more to play this game, but it would be a smaller percentage of their worth. Interestingly, in both versions someone with only one unit should borrow to play the game.
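The original fiddle's procedure isn't reproduced here, but one plausible reading of "how much you would pay" is the entry fee at which the expected value of playing, measured with x/(x+worth) and never updating worth, is exactly zero. The sketch below bisects for that fee; its numbers need not match the table above, but it shows the same qualitative behaviour: the fee grows with worth, and someone with one unit of worth should still pay more than one unit to play.

```python
# Hedged sketch: break-even entry fee under the x/(x+worth) model,
# never updating worth. This is my reading, not the original fiddle.
def value(x, worth):
    return x / (x + worth)

def expected_value(fee, worth, max_k=60):
    # Round k pays 2**(k-1) with probability 2**-k; truncate the tiny tail.
    return sum(value(2 ** (k - 1) - fee, worth) / 2 ** k
               for k in range(1, max_k + 1))

def break_even_fee(worth):
    # The fee must stay below worth + 1 (else the k=1 term's
    # denominator flips sign); bisect on the monotone expected value.
    lo, hi = 0.0, worth + 1 - 1e-9
    for _ in range(100):
        mid = (lo + hi) / 2
        if expected_value(mid, worth) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for exponent in range(4):
    print(exponent, round(break_even_fee(10 ** exponent), 2))
```

For large worth, x/(x+worth) behaves like a logarithm of relative change, so the per-decade increment in the fee should approach log2(10), about 3.32, consistent with the roughly 3 to 3.3 step observed above.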

I think there is more to be done here. So stay tuned.
