The Gambler's Ruin
The Gambler’s Ruin Problem is a famous statistical scenario centered on probabilities and experimental outcomes. It also serves as an illustration of a Markov chain with some interesting properties.
Imagine that a reluctant gambler is dragged to a casino by his friends. He is very conservative when it comes to gambling, so he takes only $50 into the casino. Since he doesn’t know much about gambling, he decides to play roulette. He places a very simple bet of $25 on red: with every spin, if red comes up he wins $25, and if black comes up he loses $25, so the odds of winning or losing are almost 50% each. Note that in casinos the odds are always slightly uneven; if they were even, the casino would never make money, so both the games and the payouts are slanted in the house’s favor. The gambler sets some simple rules: he will quit playing when he either has $0 left or is up by $25, i.e., has $75.

We can model this entire process as a Markov chain and examine its long-term behavior. A Markov chain is a model of a sequence of events in which the probability of each event depends only on the state reached in the previous event. At any point in time, the only relevant information is the amount of money the gambler currently has; how he got to that amount is irrelevant. First, we can set up a transition diagram to represent the possible evolution of this game:
There are 4 states in this transition diagram. The gambler comes into the casino with $50. He may lose all his money and go broke (get ruined), or he may win his way up to $75, at which point he stops and calls it a night. These two outcomes are the endpoints of the diagram. Since he bets $25 at a time, there is also a state in which he has $25 left. At each step, if he still has money to make a bet, the chance of winning or losing is 0.5; for example, if he has $25 and makes a bet, the chance of winning and moving to $50 is 0.5. Let’s set up a matrix from this transition diagram:
On the two ends we have what are called absorbing states. An absorbing state is a state that, once entered, is never left; it is almost like a black hole. Once the gambler is broke (ruined), he stays broke, and once he wins $75 he quits and therefore keeps that $75. In the transition matrix, the rows are labelled by the state we are coming from and the columns by the state we are going to. The 1 in the top left means that if he is broke now he will be broke next time, and the 1 in the lower right means that once he has $75 he will always have $75 because he stopped playing. The 0.5 entries show the various transitions; for example, the probability of going from the $25 state to the broke state is 0.5. This matrix carries exactly the same information as the transition diagram, just in matrix form.
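The matrix described above can be sketched in code. This is a minimal version assuming NumPy is available; the state ordering ($0, $25, $50, $75) matches the diagram:

```python
import numpy as np

# States, in order: $0 (broke), $25, $50, $75 (quit)
states = ["$0", "$25", "$50", "$75"]

P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # broke stays broke (absorbing state)
    [0.5, 0.0, 0.5, 0.0],   # from $25: lose -> $0, win -> $50
    [0.0, 0.5, 0.0, 0.5],   # from $50: lose -> $25, win -> $75
    [0.0, 0.0, 0.0, 1.0],   # $75 stays $75 (absorbing state)
])

# Sanity check: every row must sum to 1, since each row is a
# probability distribution over the next state.
assert np.allclose(P.sum(axis=1), 1.0)
```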
What do we find out when we analyze this matrix and look at some long-run probabilities of this game? Let’s start by taking the transition matrix 2 steps into the future.
Looking at the second row, second column, we get a probability of 0.25. This means that if the gambler walks into the casino with $25 and plays this game, the probability that he still has $25 in his pocket two spins from now is 0.25, or 25%. We can take this further and look 10 spins into the game:
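Taking the matrix two steps into the future just means squaring it. A short sketch (again assuming NumPy):

```python
import numpy as np
from numpy.linalg import matrix_power

# Transition matrix; states in order: $0, $25, $50, $75
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

P2 = matrix_power(P, 2)  # two-step transition probabilities

# Second row, second column: start with $25, still have $25 two spins later
print(P2[1, 1])  # 0.25
```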
Interpreting the matrix above: if he walks in with $25, the probability that he is broke 10 spins from now is roughly 0.667, but the next row shows that if he comes in with $50 the chance of being broke after 10 spins is roughly 0.333. So the probability of going broke or ending up with $75 also depends on the amount of money he started with.
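The same 10-spin probabilities can be reproduced by raising the matrix to the 10th power (a sketch assuming NumPy; exact decimals may differ slightly from the rounded figures above):

```python
import numpy as np
from numpy.linalg import matrix_power

# Transition matrix; states in order: $0, $25, $50, $75
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

P10 = matrix_power(P, 10)  # ten-step transition probabilities

# Row for $25 (index 1), column for $0: close to 2/3
# Row for $50 (index 2), column for $0: close to 1/3
print(np.round(P10, 3))
```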
If we run this matrix far into the future, we see that the probability of going broke is ⅔ if he comes in with $25 but ⅓ if he comes in with $50. Similarly, the probability of ending up with $75 is ⅓ if he comes in with $25 and ⅔ if he comes in with $50.
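"Running the matrix into the future" can be approximated by taking a large power; after enough steps, essentially all the probability has been absorbed into the $0 and $75 states. A sketch assuming NumPy:

```python
import numpy as np
from numpy.linalg import matrix_power

# Transition matrix; states in order: $0, $25, $50, $75
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

# 100 steps is effectively the long-run limit here: the probability of
# still being in a transient state shrinks geometrically with each pair
# of spins.
P_limit = matrix_power(P, 100)

# From $25: broke with probability ~2/3, $75 with probability ~1/3
# From $50: broke with probability ~1/3, $75 with probability ~2/3
print(np.round(P_limit, 3))
```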
It might be difficult to believe that, in a fair game, the probability that someone will win their desired amount or get ruined is determined by their initial wealth. Using Markov chains, we can compute the same probabilities over any number of games from the transition matrix. This concept is directly relevant to gambling, but it also appears in various mathematical theorems with wide applications in probability and statistics.
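The dependence on initial wealth also follows from the classic closed-form gambler's ruin result (not derived in this article): in a fair game, starting with i betting units and quitting at n units, the probability of ruin is 1 − i/n. A small sketch checking it against the numbers above:

```python
def ruin_probability(i, n):
    """Classic gambler's ruin result for a fair game: probability of
    going broke when starting with i betting units and quitting at n
    units. Here one unit is one $25 bet."""
    return 1 - i / n

# With a $25 bet size and a $75 target, n = 3 units:
print(ruin_probability(1, 3))  # starting with $25: ruin probability 2/3
print(ruin_probability(2, 3))  # starting with $50: ruin probability 1/3
```

These match the ⅔ and ⅓ figures obtained from the transition matrix.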