I couldn't decide if I should start a new thread or reopen an old one, but since there seem to be a lot of Excel threads, and this one is related to my son's baseball team, I figured I would put it here. And this may be more of a math question than an Excel one anyway.
So we are running a Super Bowl squares pool as a fundraiser for his summer team, as discussed above. The plan is that half the proceeds will go to the team, and half will pay out to the four winning boxes. But here's the thing: any unsold boxes will be "held" by the team, so if any one of those empty boxes hits, the team just adds those winnings back to the kitty. So I'm trying to figure out how to calculate expected winnings based on the number of boxes sold.
Without the empty-boxes issue it's pretty simple: ([# of boxes sold] * [$ per box]) - [total payouts], where total payouts are half of a full pot, i.e. (100 * [$ per box]) / 2. So the more boxes sold, the more money the team gets. Or put another way, once we sell half the boxes, every additional box sold goes straight to the bottom line.
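Writing that base case out with our actual numbers (the way I understand it: the team collects per-box fees and is on the hook for prizes totaling half of a full-sale pot), a quick sketch:

```python
# Team's net if we ignore the "team holds unsold boxes" wrinkle.
PRICE_PER_BOX = 20
TOTAL_PRIZES = 100 * PRICE_PER_BOX // 2  # half of a full-sale pot = $1,000

def net_ignoring_unsold(boxes_sold):
    # Collect the box fees, pay out the full prize pool.
    return boxes_sold * PRICE_PER_BOX - TOTAL_PRIZES

print(net_ignoring_unsold(50))   # break-even at 50 boxes sold
print(net_ignoring_unsold(100))  # full sale
```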
But an unsold box has a 1% chance to hit and add money to the bottom line. The more unsold boxes, the higher the cumulative percentage the team holds, right? IOW, if there are 25 unsold boxes, the team has a 25% chance to hit each quarter, correct?
Do I simply multiply that cumulative percentage by the possible winnings for each quarter to determine "expected winnings"? Is that even the right way to look at it?
If it matters, our current plan is $20/box; $125 for 1st and 3rd quarters; $250 for halftime; and $500 for final score (not end of 4th Q).
So ultimately, is the formula [net of sold boxes] + ["expected winnings" of 1st Q] + ["expected winnings" of halftime] + ["expected winnings" of 3rd Q] + ["expected winnings" of final]? Or is it more of a binomial-probability type calculation?