Law of Large Numbers
We are all curious about experimenting and discovering new and intriguing ideas in different areas of life and study. But one question we must ask ourselves is: can we draw accurate conclusions from the limited experiments we run on a particular subject?
The Law of Large Numbers is a concept from probability theory, widely used in statistics, which states that as the size of a sample grows, its mean gets closer and closer to the mean of the whole population. In a financial context, the law carries a different connotation, one related to growth rates.
The law of large numbers states that if the same experiment is repeated independently a great number of times, the average of the results will be close to the expected value, and it gets closer as the number of trials increases. The law is an essential concept in statistics because it guarantees stable, accurate long-term results even from random trials, provided the number of trials is large. It is important to understand that the average of an experiment repeated only a few times can differ substantially from the expected value; however, each additional trial improves the accuracy of the average result.
COIN FLIPPING EXAMPLE
Now, let’s look at coin flips. A coin flip is a Bernoulli trial, since there are only two outcomes, heads and tails; the number of heads across repeated flips follows the binomial distribution. Define the event of interest as the coin landing heads, and count each flip as one trial. The law of large numbers proposes that as the number of trials increases, the observed proportion of heads converges to the expected value of 0.50 (each flip has a 0.50 probability of landing heads).
As the sample size increases, the sample proportion becomes more stable and accurate, converging on the expected probability of 0.50.
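The convergence described above is easy to check with a quick simulation. The sketch below (function name and sample sizes are my own choices, not from the original) flips a fair coin with Python's standard `random` module and prints the proportion of heads for increasingly large numbers of flips:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def proportion_of_heads(n_flips):
    """Simulate n_flips fair coin tosses and return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# The proportion should wander for small samples and settle near 0.50
# as the number of flips grows, as the law of large numbers predicts.
for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9} flips: proportion of heads = {proportion_of_heads(n):.4f}")
```

Running this, the small samples can land noticeably above or below 0.50, while the million-flip estimate sits very close to it.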
IN BUSINESS AND FINANCE TERMS
In business, the term “law of large numbers” is used mainly in relation to growth rates expressed as a percentage. It suggests that as a business grows, its percentage growth rate becomes increasingly difficult to maintain.
The law of large numbers does not mean that a given sample, or a series of successive samples, will always reflect the true population characteristics; this is especially so for small samples or a small number of results. Nor does it mean that if a sample drifts away from the true population mean or expected value, later samples will compensate by pulling the observed average back toward it; believing that they will is the Gambler’s Fallacy. In finance, the essence of the LLN is simply that as a business grows, it gets harder to maintain its previous growth rates.
Let’s assume there are two companies, ABC Ltd. and XYZ Ltd., with market capitalizations of $1 million and $100 million respectively. A 50% growth rate for the current year is attainable for ABC, since it only needs to add $500,000 of value. The same is not realistic for XYZ, which would have to add $50 million to reach the same 50% growth rate. Thus, as XYZ Ltd. continues to expand, its growth rate will tend to decline.
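The arithmetic behind the ABC/XYZ comparison can be spelled out in a few lines. The helper function below is purely illustrative (the company figures come from the example above; the function name is my own):

```python
def dollars_needed(market_cap, growth_rate):
    """Absolute dollar growth required to hit a given percentage growth rate."""
    return market_cap * growth_rate

abc_cap = 1_000_000      # ABC Ltd.: $1 million market capitalization
xyz_cap = 100_000_000    # XYZ Ltd.: $100 million market capitalization

# Both firms target the same 50% growth rate, but the absolute
# dollar amounts they must add differ by a factor of 100.
print(f"ABC must add ${dollars_needed(abc_cap, 0.50):,.0f}")   # $500,000
print(f"XYZ must add ${dollars_needed(xyz_cap, 0.50):,.0f}")   # $50,000,000
```

The percentage target is identical, but the absolute hurdle grows in proportion to the company's size, which is why large firms rarely sustain the growth rates of small ones.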
TYPES OF LAW OF LARGE NUMBERS
There are two main forms of the law of large numbers: the weak law and the strong law. The difference between the two is mostly theoretical.
THE WEAK LAW: The mean of a sample gets nearer to, i.e., converges to, the population mean as the sample size grows. This result is also known as Khinchin’s law, and it is commonly proved using the Bienaymé–Chebyshev inequality. In other words, the Weak Law of Large Numbers states that the sample average converges in probability towards the expected value of the population.
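In symbols, writing $\bar{X}_n$ for the average of $n$ independent, identically distributed samples with expected value $\mu$, convergence in probability means:

```latex
\lim_{n \to \infty} \Pr\!\left( \left| \bar{X}_n - \mu \right| > \varepsilon \right) = 0
\qquad \text{for every } \varepsilon > 0.
```

That is, for any fixed tolerance $\varepsilon$, the chance that the sample average misses $\mu$ by more than $\varepsilon$ shrinks to zero as $n$ grows.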
THE STRONG LAW: The strong law of large numbers (also called Kolmogorov’s law) states that the sample average converges almost surely to the expected value of the population. That is, as the sample size grows without bound, the probability that the sample average converges to the expected value equals one.
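In the same notation, almost sure convergence is written:

```latex
\Pr\!\left( \lim_{n \to \infty} \bar{X}_n = \mu \right) = 1.
```

Here the limit statement itself is the event, and the strong law asserts that this event has probability one.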
THE DIFFERENCE - The weak law says the sample average converges to the expected value in probability, whereas the strong law asserts almost sure convergence. Under the weak law, the probability of the average being close to the expected value tends to 1 as the sample grows; under the strong law, the probability that the average eventually converges equals 1.
The law of large numbers plays an essential role because it guarantees stable averages of random events. For instance, a casino may lose money on a single spin of the roulette wheel, but its earnings converge to a predictable average over a large number of spins, and any winning or losing streak by a player is eventually diluted by the parameters of the game. It is important to remember that the law applies only when a large number of observations is considered; it says nothing about a small number of observations, and it does not mean that a streak of one outcome will be immediately balanced by the others. Rather, with a greater number of observations, the averages eventually coincide with the expected value.