
Let's gamble

Dissipate

Suppose I had a fair coin and I made this offer to you. I flip the coin until the first tails appears. If the tails appears on the first toss I pay you $2. If it appears on the second toss I pay you $4 and in general, I pay you $2^k if the tails appears on the kth toss.

How much would you be willing to bet to play this game?
 
What a stupid game.


edit: so wait, if I wager $1 and the first tail comes on the 10th flip, I'll get $2^10 just for betting $1?

Sounds like a great game then.
 
I would only gamble small, or not at all. For example, a ten dollar bet:

One tail - $2, .5 chance, LOSS $8
Two tails - $4, .25 chance, LOSS $6
Three tails - $8, .125 chance, LOSS $2
Four tails - $16, .0625 chance, GAIN $6

At only a 12.5% chance of coming out ahead (the first tails has to land on toss four or later), it's not worth it. But now that I think about it, what stops me from betting a dollar and never losing money?!
 
Originally posted by: Dissipate
Suppose I had a fair coin and I made this offer to you. I flip the coin until the first tails appears. If the tails appears on the first toss I pay you $2. If it appears on the second toss I pay you $4 and in general, I pay you $2^k if the tails appears on the kth toss.

How much would you be willing to bet to play this game?

I don't get it. So no matter what happens, you always pay me... Or am I missing something obvious.
 
Ah, I had to do this problem in my statistics class, too. I forget the point at which you break even, though, and don't feel like trying to figure it out again.
 
Originally posted by: notfred
Ah, I had to do this problem in my statistics class, too. I forget the point at which you break even, though, and don't feel like trying to figure it out again.

Isn't the expected value of the game just...

2*(1/2) + 4*(1/4) + 8*(1/8) + 16*(1/16) + ...

which is the same as:

1 + 1 + 1 + 1 + ...

which is like, uh, basically infinity. So no matter what you bet, you expect to win more money than lose it in the long run.
 
Originally posted by: tikwanleap
Originally posted by: Dissipate
Suppose I had a fair coin and I made this offer to you. I flip the coin until the first tails appears. If the tails appears on the first toss I pay you $2. If it appears on the second toss I pay you $4 and in general, I pay you $2^k if the tails appears on the kth toss.

How much would you be willing to bet to play this game?

I don't get it. So no matter what happens, you always pay me... Or am I missing something obvious.

That's what I'm wondering.

OP, you need to reword your problem to include wagering. Otherwise as it stands, we can bet a penny each time and you'd still pay us $2^K.
 
Originally posted by: tikwanleap
Originally posted by: notfred
Ah, I had to do this problem in my statistics class, too. I forget the point at which you break even, though, and don't feel like trying to figure it out again.

Isn't the expected value of the game just...

2*(1/2) + 4*(1/4) + 8*(1/8) + 16*(1/16) + ...

which is the same as:

1 + 1 + 1 + 1 + ...

which is like, uh, basically infinity. So no matter what you bet, you expect to win more money than lose it in the long run.

No, that's not right. I don't remember exactly how to do it, though.

After running 50,000,000 simulated games and recording the winnings, I'll say I'm willing to bet $20 on this game, since the average winnings seem to be right around $22. I deserve to make an average of $2/game for the time spent playing it.

It only takes one really high winning streak to bring that average *WAY* up, though.
 
Originally posted by: Schfifty Five
Originally posted by: tikwanleap
Originally posted by: Dissipate
Suppose I had a fair coin and I made this offer to you. I flip the coin until the first tails appears. If the tails appears on the first toss I pay you $2. If it appears on the second toss I pay you $4 and in general, I pay you $2^k if the tails appears on the kth toss.

How much would you be willing to bet to play this game?

I don't get it. So no matter what happens, you always pay me... Or am I missing something obvious.

That's what I'm wondering.

OP, you need to reword your problem to include wagering. Otherwise as it stands, we can bet a penny each time and you'd still pay us $2^K.

That's why I said, what is the most you would be willing to wager.
 
Originally posted by: notfred
Originally posted by: tikwanleap
Originally posted by: notfred
Ah, I had to do this problem in my statistics class, too. I forget the point at which you break even, though, and don't feel like trying to figure it out again.

Isn't the expected value of the game just...

2*(1/2) + 4*(1/4) + 8*(1/8) + 16*(1/16) + ...

which is the same as:

1 + 1 + 1 + 1 + ...

which is like, uh, basically infinity. So no matter what you bet, you expect to win more money than lose it in the long run.

No, that's not right. I don't remember exactly how to do it, though.

After running 50,000,000 simulated games and recording the winnings, I'll say I'm willing to bet $20 on this game, since the average winnings seem to be right around $22. I deserve to make an average of $2/game for the time spent playing it.

It only takes one really high winning streak to bring that average *WAY* up, though.

So is the OP asking "what is the HIGHEST you would bet each time" or is he just asking "how much would you WANT to bet"?

If it's the latter, then I'd bet a penny or nothing if allowed....b/c his question doesn't state anything about minimum bets/wagers. I'll just bet a penny each time and I'm guaranteed at least $2 each time.
 
Originally posted by: Schfifty Five

So is the OP asking "what is the HIGHEST you would bet each time" or is he just asking "how much would you WANT to bet"?

If it's the latter, then I'd bet a penny or nothing if allowed....b/c his question doesn't state anything about minimum bets/wagers. I'll just bet a penny each time and I'm guaranteed at least $2 each time.


If I am going to sell you a car I will probably ask you at some point: what is the highest amount you are willing to pay for this car.

It is the same thing here. I am asking you how much you would be willing to pay to play this game. You may want to wager only a penny but that doesn't mean I would let you wager just a penny.
 
I change my answer. I'll bet $10 on this game.

C code to simulate 100,000,000 plays of this game:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <time.h>

int main(){
    // Total amount won so far.
    double won = 0;

    // Number of games to play.
    int games = 100000000;

    // Seed the random number generator.
    srand(time(NULL));

    // Simulate however many games.
    int i = 0;
    for (i = 0; i < games; i++){
        int roll = rand(); // Generate a random number.
        int rolls = 1; // This was our first roll.
        while (roll % 2){ // If it's odd, we continue.
            roll = rand(); // Get a new random number.
            rolls++; // Increment the number of rolls in the game.
        }
        // We win 2^number_of_rolls dollars.
        double win = pow(2.0, (double)rolls);
        won += win; // Add our winnings to the total winnings.
    }

    // Divide total winnings by number of games for the average winnings.
    double avg = won / (double)games;

    // Print out the average winnings.
    printf("Average winnings: %f\n", avg);
    return 0;
}

It's telling me the average winnings after 100,000,000 games is about $16.17 per game.
 
Originally posted by: notfred
Originally posted by: Dissipate
Originally posted by: notfred
It's telling me the average winnings after 100,000,000 games is about $16.17 per game.


Try increasing the number of games played.

I could, but it already takes about 15 seconds to run.


Take the variable declarations out of the for loop. You are creating too many instances of the variables, hence wasting memory. (I think)

Declare the variables outside the for loop at the start. You can always initialize them (and reset them) to whatever you want inside the loop.
 
Originally posted by: gsethi
Originally posted by: notfred
Originally posted by: Dissipate
Originally posted by: notfred
It's telling me the average winnings after 100,000,000 games is about $16.17 per game.


Try increasing the number of games played.

I could, but it already takes about 15 seconds to run.


Take the variable declarations out of the for loop. You are creating too many instances of the variables, hence wasting memory. (I think)

Declare the variables outside the for loop at the start. You can always initialize them (and reset them) to whatever you want inside the loop.

They pass out of scope at the end of the loop anyway. I guess I could be wasting a little bit of time on memory allocation, but I'm not "wasting memory". I'm not exactly sure how the compiler manages allocation with various optimizations enabled, but I don't think free() is actually called at the end of each iteration of the loop; these are stack variables, not heap allocations. I think the program retains the memory allocated for those two integers across iterations of the loop and just reuses the same block each time they're declared. I could test this by creating a loop, not initializing the values, and seeing if they keep their previous values.

EDIT:
Yep, just tested it. Even with redeclaring the variables inside the loop, I'm getting the exact same memory addresses for each iteration of the loop, making me think that the compiler isn't freeing the memory between iterations.
 
Originally posted by: Dissipate
Originally posted by: Schfifty Five
Originally posted by: tikwanleap
Originally posted by: Dissipate
Suppose I had a fair coin and I made this offer to you. I flip the coin until the first tails appears. If the tails appears on the first toss I pay you $2. If it appears on the second toss I pay you $4 and in general, I pay you $2^k if the tails appears on the kth toss.

How much would you be willing to bet to play this game?

I don't get it. So no matter what happens, you always pay me... Or am I missing something obvious.

That's what I'm wondering.

OP, you need to reword your problem to include wagering. Otherwise as it stands, we can bet a penny each time and you'd still pay us $2^K.

That's why I said, what is the most you would be willing to wager.

Uh, no you didn't. You said "How MUCH" would you be willing to wager. That's not the same as asking what is the MOST.
 
Originally posted by: Dissipate
Originally posted by: Schfifty Five

So is the OP asking "what is the HIGHEST you would bet each time" or is he just asking "how much would you WANT to bet"?

If it's the latter, then I'd bet a penny or nothing if allowed....b/c his question doesn't state anything about minimum bets/wagers. I'll just bet a penny each time and I'm guaranteed at least $2 each time.


If I am going to sell you a car I will probably ask you at some point: what is the highest amount you are willing to pay for this car.

It is the same thing here. I am asking you how much you would be willing to pay to play this game. You may want to wager only a penny but that doesn't mean I would let you wager just a penny.

Well then, perhaps the question should be: "What is the highest wager you can bet before the odds are against you?"

You never said there was a condition on how much you would accept or "let us play" this game.

So an answer of 'wager nothing' is still 100% valid according to the original question.
 