Physics 434, 2016: Law of large numbers

From Ilya Nemenman: Theoretical Biophysics @ Emory

Back to the main Teaching page.

Back to Physics 434, 2016: Physical Biology.

One of the important results in probability theory is the law of large numbers, which is what we consider in this lecture.

Let's ask the following question. Suppose we have a biased coin, with an unknown probability <math>q</math> for coming heads up (in other words, the coin is a Bernoulli random variable with probability <math>q</math>). The coin is thrown <math>N</math> times, and comes up <math>n</math> times heads up. Clearly, in general, the frequency <math>n/N</math> is not going to be exactly <math>q</math> (for starters, <math>n/N</math> is always rational, and <math>q</math> can be arbitrary). But how close will they be? This is a very important question. Indeed, we previously defined the probability as a limit of frequencies for large <math>N</math>. Is our definition self-consistent? Do the frequencies actually converge to probabilities?
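
Before doing the math, we can probe this question numerically. Here is a minimal simulation sketch (the bias <math>q=0.3</math>, the random seed, and the sample sizes are arbitrary choices for illustration):

<syntaxhighlight lang="python">
import numpy as np

# Minimal sketch: throw a biased coin N times and compare the frequency n/N
# to the assumed probability q for several values of N.
rng = np.random.default_rng(0)
q = 0.3  # assumed bias of the coin; any value in (0, 1) would do

for N in [10, 100, 1000, 10000, 100000]:
    n = int(np.sum(rng.random(N) < q))  # number of heads in N throws
    print(f"N = {N:6d}   n/N = {n / N:.4f}   q = {q}")
</syntaxhighlight>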

To answer this, let's suppose we have a variable <math>x</math> with an expectation value <math>\langle x \rangle</math> and the variance <math>\sigma^2_x</math>. We take <math>N</math> samples of this variable. We then define the empirical mean, or the sample mean, <math>\bar{x} = \frac{1}{N}\sum_{i=1}^N x_i</math>. Note here that we use the angular-bracket notation <math>\langle \cdots \rangle</math> to denote expectation values, and the overbar <math>\overline{\cdots}</math> to denote empirical means. Our question of whether frequencies converge to probabilities is then a special case of a more general question: how close is the expectation value of a variable <math>x</math>, namely <math>\langle x \rangle</math>, to its empirical mean <math>\bar{x}</math>?
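
As a quick illustration of the notation (a sketch, with the uniform distribution and the sample size chosen arbitrarily), one can compare <math>\bar{x}</math> to <math>\langle x \rangle</math> directly:

<syntaxhighlight lang="python">
import numpy as np

# Sketch: empirical mean vs. expectation value for x uniform on [0, 1],
# for which the true expectation value is 1/2.
rng = np.random.default_rng(1)
N = 1000                 # number of samples (arbitrary choice)
x = rng.random(N)        # N samples of x
x_bar = x.mean()         # the empirical mean
print(f"empirical mean = {x_bar:.4f},  expectation value = 0.5")
</syntaxhighlight>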

This question is easy to answer for the special case of frequencies of Bernoulli variables. Here we can use the fact that the distribution of the number of heads <math>n</math> is given by the binomial distribution, <math>P(n) = \binom{N}{n} q^n (1-q)^{N-n}</math>. As we showed previously, for the binomial distribution, <math>\langle n \rangle = Nq</math>, so that the mean frequency becomes <math>\langle n/N \rangle = q</math>. Thus, indeed, the frequency converges to the probability. Further, for the binomial distribution, <math>\sigma^2_n = Nq(1-q)</math>, and thus the standard deviation of the frequency becomes <math>\sigma_{n/N} = \sqrt{q(1-q)/N}</math>. Thus the ratio of the standard deviation of the frequency to its mean is <math>\sigma_{n/N}/\langle n/N \rangle = \sqrt{(1-q)/(qN)} \propto 1/\sqrt{N}</math>. So not only does the frequency of a Bernoulli variable converge to the probability, but the error decreases as about <math>1/\sqrt{N}</math>.
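
This scaling is easy to verify numerically. The sketch below (with an assumed bias <math>q=0.3</math> and an arbitrary number of repeated experiments) estimates the standard deviation of the frequency over many experiments and compares it to <math>\sqrt{q(1-q)/N}</math>:

<syntaxhighlight lang="python">
import numpy as np

# Sketch: repeat the N-throw experiment many times, measure the standard
# deviation of the frequency n/N, and compare to the binomial prediction.
rng = np.random.default_rng(2)
q, trials = 0.3, 20000   # assumed bias and number of repeated experiments

for N in [100, 1000, 10000]:
    n = rng.binomial(N, q, size=trials)     # heads counts in each experiment
    measured = (n / N).std()                # empirical std of the frequency
    predicted = np.sqrt(q * (1 - q) / N)    # sqrt(q(1-q)/N)
    print(f"N = {N:5d}   measured = {measured:.5f}   predicted = {predicted:.5f}")
</syntaxhighlight>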

Does this result hold more generally, beyond Bernoulli variables? We previously showed that, for independent variables, means and variances add. Let's use this fact to answer the question. What are the expected value and the standard deviation of the empirical mean? We can write the mean as <math>\langle \bar{x} \rangle = \left\langle \frac{1}{N} \sum_{i=1}^N x_i \right\rangle</math>. Using the law of summation of the means, this becomes <math>\langle \bar{x} \rangle = \frac{1}{N} \sum_{i=1}^N \langle x_i \rangle = \frac{1}{N} N \langle x \rangle = \langle x \rangle</math>. Thus the empirical mean converges to the true mean, as long as the true mean exists (for some long-tailed distributions, means don't exist, as we will see in the homework problem). How quickly does this convergence happen? Since the samples are independent, the variance of their sum is the sum of their variances, and the prefactor <math>1/N</math> gets squared, so that <math>\sigma^2_{\bar{x}} = \frac{1}{N^2} \sum_{i=1}^N \sigma^2_{x_i} = \frac{\sigma^2_x}{N}</math>, or <math>\sigma_{\bar{x}} = \sigma_x/\sqrt{N}</math>. Thus the spread of the empirical mean around the expectation value decreases with the number of samples as <math>1/\sqrt{N}</math>, provided, of course, that the variance of a single sample is finite.
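
The same <math>1/\sqrt{N}</math> behavior can be checked for a non-Bernoulli variable. The sketch below uses an exponential distribution (an arbitrary choice, for which <math>\sigma_x = 1</math>), so the standard deviation of the empirical mean should be close to <math>1/\sqrt{N}</math>:

<syntaxhighlight lang="python">
import numpy as np

# Sketch: for an exponential variable with sigma_x = 1, the standard
# deviation of the empirical mean should fall off as 1/sqrt(N).
rng = np.random.default_rng(3)
trials = 20000           # number of repeated experiments (arbitrary)

for N in [10, 100, 1000]:
    x = rng.exponential(scale=1.0, size=(trials, N))
    x_bar = x.mean(axis=1)               # empirical mean of each experiment
    print(f"N = {N:4d}   std of empirical mean = {x_bar.std():.4f}   "
          f"1/sqrt(N) = {1 / np.sqrt(N):.4f}")
</syntaxhighlight>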