Posted on 07/17/2010 6:55:34 AM PDT by mattstat
The estimable Daniel Bernoulli gave us this problem, one of the first creations of decision theory. You have to pay a certain amount of money to play the following game:
A pot starts out with one dollar, and a coin is tossed. If a head shows, the amount in the pot is doubled and the coin is tossed again; the tossing repeats until a tail appears. When a tail shows, the game is over and you win whatever is in the pot. How much should you pay to play?
Suppose you pay ten bucks and the coin shows a tail on the very first throw. You win the one dollar in the pot, but the game costs you a bundle. You won't make any money unless the first tail waits until at least the fifth throw.
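The game is easy to simulate. Here is a minimal Python sketch (the function name and the number of rounds are my own choices, not from the article): the pot doubles on each head, and the first tail ends the round and pays out the pot.

```python
import random

def play_st_petersburg():
    """One round of the game: the pot starts at $1 and doubles on
    each head; the first tail ends the round and pays the pot."""
    pot = 1
    while random.random() < 0.5:  # coin shows a head
        pot *= 2
    return pot  # a tail appeared; payout is 2**(number of heads)

random.seed(1)
rounds = 100_000
average_payout = sum(play_st_petersburg() for _ in range(rounds)) / rounds
print(average_payout)
```

Note that the simulated average keeps creeping upward as you add rounds, a hint of the paradox the standard expected-value argument runs into.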
The standard solution begins by introducing the idea of expected value. This is usually a misnomer, because the expected value is often one that you do not expect, or one that is impossible. Its formal definition is this: the sum of each possible outcome multiplied by the probability of that outcome.
For example, the expected value of a die roll is:
EV = 0.167*1 + 0.167*2 + 0.167*3 + 0.167*4 + 0.167*5 + 0.167*6 = 3.5,
where 0.167 is 1/6, the probability of seeing any result. This says we expect to see 3.5, which is impossible. The dodge we introduce...
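The die-roll arithmetic above can be checked directly; this short Python sketch uses the exact fraction 1/6 rather than the rounded 0.167:

```python
# Expected value of one fair die roll: each face 1..6 occurs
# with probability 1/6, so EV = (1+2+3+4+5+6)/6.
faces = range(1, 7)
ev = sum(face * (1 / 6) for face in faces)
print(ev)  # 3.5
```

As the article says, 3.5 is a value the die can never actually show.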
(Excerpt) Read more at wmbriggs.com ...