Physics 380, 2010: Basic Probability Theory
Lectures 2 and 3
During these lectures, we will review some basic concepts of probability theory, such as probability distributions, conditionals, marginals, expectations, etc. We will discuss the central limit theorem and will derive some properties of random walks. Finally, we will study some specific useful probability distributions.
A very good introduction to probability theory can be found in Introduction to Probability by CM Grinstead and JL Snell.
- Random variables: motion of E. coli, time to neural action potential; diffusion and first passage
- Sample space, events, probabilities -- probability space
- Properties of distributions (a short numerical check follows this list):
- nonnegativity: $P(A) \ge 0$ for any event $A$
- unit normalization: $P(\Omega) = 1$
- nesting: if $A \subset B$, then $P(A) \le P(B)$
- additivity (for non-disjoint events): $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
- complementarity: $P(\bar{A}) = 1 - P(A)$
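A minimal numerical check of these rules, using a toy example of our own (a fair six-sided die):

```python
# Check of the additivity, complementarity, and nesting rules on a
# discrete probability space: a fair six-sided die (our own toy example).
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
m = {x: Fraction(1, 6) for x in omega}      # uniform elementary probabilities

def prob(event):
    """P(E) = sum of elementary probabilities over the outcomes in E."""
    return sum(m[x] for x in event)

A = {2, 4, 6}   # "even"
B = {1, 2, 3}   # "at most 3"

# additivity for non-disjoint events: P(A u B) = P(A) + P(B) - P(A n B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
# complementarity: P(complement of A) = 1 - P(A)
assert prob(omega - A) == 1 - prob(A)
# nesting: {2} is a subset of A, so P({2}) <= P(A)
assert prob({2}) <= prob(A)
print("all properties check out")
```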
- Continuous and discrete events: probability distributions $P(x_i)$ for discrete variables, and probability densities $p(x)$ for continuous ones, with $P(x \in [x, x + dx]) = p(x)\,dx$
- Cumulative distributions: $C(x) = P(X \le x) = \int_{-\infty}^{x} p(x')\,dx'$
- Change of variables for continuous and discrete variates: for discrete variables, probabilities simply transfer, $P(y) = \sum_{x:\,f(x)=y} P(x)$; for a continuous variable $y = f(x)$, $p(y) = p(x)\,|dx/dy|$; for multi-dimensional variables, $p(\mathbf{y}) = p(\mathbf{x})\,|\det(\partial x_i/\partial y_j)|$ (the Jacobian)
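As an illustration (our own sketch, with an assumed rate $r = 2$): applying the change-of-variables formula to $t = -\ln(u)/r$, with $u$ uniform on $(0,1)$, gives $p(t) = p(u)\,|du/dt| = r e^{-rt}$, which the following snippet checks by sampling:

```python
# Change of variables demo: u ~ Uniform(0,1), t = -ln(u)/r  =>  t ~ Exponential(r).
# The rate r and the sample size are assumed values, for illustration only.
import math
import random

r = 2.0
n = 100_000
samples = [-math.log(1.0 - random.random()) / r for _ in range(n)]  # u in (0,1]

mean = sum(samples) / n
var = sum((t - mean) ** 2 for t in samples) / n
print(f"sample mean {mean:.4f} vs 1/r = {1/r:.4f}")        # exponential mean
print(f"sample var  {var:.4f} vs 1/r^2 = {1/r**2:.4f}")    # exponential variance
```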
- Distributions:
- uniform, $p(x) = 1/(b-a)$ for $x \in [a, b]$: the time of a single spike, or of a single E. coli tumble, within a fixed window.
- exponential, $p(t) = r e^{-rt}$: time to the next action potential at constant rate $r$.
- Poisson, $P(n) = \frac{\lambda^n}{n!} e^{-\lambda}$: number of action potentials in a given interval; number of E. coli tumbles.
- normal, $p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}$: diffusive motion.
- $\delta$-distribution: the deterministic limit $\sigma \to 0$; $p(x) = \delta(x - x_0)$.
- Conditional and joint probabilities, Bayes theorem: $P(A|B) = P(A,B)/P(B)$; marginalization: $P(A) = \sum_B P(A,B)$; Bayes theorem: $P(A|B) = P(B|A)\,P(A)/P(B)$. (A numerical illustration follows the multivariate normal below.)
- independence: two variables are independent if and only if $P(A,B) = P(A)\,P(B)$, or, equivalently, $P(A|B) = P(A)$ or $P(B|A) = P(B)$.
- Distributions:
- multivariate normal: $p(\mathbf{x}) = \frac{1}{\sqrt{(2\pi)^d \det C}} \exp\left[-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T C^{-1} (\mathbf{x}-\boldsymbol{\mu})\right]$; here $C$ is the covariance matrix, $C_{ij} = \langle (x_i - \mu_i)(x_j - \mu_j) \rangle$
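To make Bayes' theorem above concrete, here is a small numerical illustration (the stimulus/response numbers are invented for this example):

```python
# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B), with P(B) by marginalization.
# A = "stimulus present", B = "neuron fires"; all numbers are made up.
p_stim = 0.1              # prior P(A)
p_fire_stim = 0.9         # likelihood P(B|A)
p_fire_nostim = 0.2       # P(B|not A)

# marginal: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_fire = p_fire_stim * p_stim + p_fire_nostim * (1 - p_stim)

# posterior: P(A|B)
p_stim_fire = p_fire_stim * p_stim / p_fire
print(p_stim_fire)        # 0.09 / 0.27 = 1/3
```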
- Expected values: $\langle f(x) \rangle = \sum_x f(x)\,P(x)$, or $\int f(x)\,p(x)\,dx$ for densities. In particular, a few of the expectation values are very common: the mean, $\mu = \langle x \rangle$, and the variance, $\sigma^2 = \langle (x - \mu)^2 \rangle = \langle x^2 \rangle - \langle x \rangle^2$.
- addition of independent variables: in general, $\langle x + y \rangle = \langle x \rangle + \langle y \rangle$, and $\sigma^2_{x+y} = \sigma^2_x + \sigma^2_y$, provided $x$ and $y$ are independent, that is, $p(x,y) = p(x)\,p(y)$. (See the check below.)
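A quick sampling check of the addition rules (our own sketch; the two Gaussians and the sample size are arbitrary choices):

```python
# Means add always; variances add when the variables are independent.
import random

n = 200_000
x = [random.gauss(1.0, 2.0) for _ in range(n)]    # mean 1, variance 4
y = [random.gauss(-1.0, 3.0) for _ in range(n)]   # mean -1, variance 9, independent of x
z = [a + b for a, b in zip(x, y)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    mu = mean(v)
    return sum((a - mu) ** 2 for a in v) / len(v)

print(mean(z), mean(x) + mean(y))   # both close to 0
print(var(z), var(x) + var(y))      # both close to 4 + 9 = 13
```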
- Moments, central moments, and cumulants
- moments: $m_n = \langle x^n \rangle$
- central moments: $\mu_n = \langle (x - \langle x \rangle)^n \rangle$: distribution mean, width, asymmetry, flatness, etc.
- cumulants: $\kappa_1 = \langle x \rangle$, $\kappa_2 = \sigma^2$, $\kappa_3 = \mu_3$, and higher-order cumulants measure the difference of the distribution from a Gaussian (for a Gaussian, all cumulants beyond the second are zero)
- Moment and cumulant generating functions: the Gaussian integral, $\int_{-\infty}^{\infty} e^{-ax^2/2 + bx}\,dx = \sqrt{2\pi/a}\;e^{b^2/(2a)}$. (A worked Gaussian example follows below.)
- Moment generating function (MGF): $M_x(t) = \langle e^{tx} \rangle$. The utility of the MGF comes from the following result: $m_n = \langle x^n \rangle = \left.\frac{d^n M_x}{dt^n}\right|_{t=0}$.
- Properties of MGF: $M_{x+y}(t) = M_x(t)\,M_y(t)$ for independent $x$ and $y$. From this we can show that if $z = \sum_{i=1}^{n} x_i$ with the $x_i$ i.i.d., that is, $p(x_1, \ldots, x_n) = \prod_i p(x_i)$, then $M_z(t) = \left[M_x(t)\right]^n$.
- Cumulant generating function (CGF): $K_x(t) = \ln M_x(t)$. Then the cumulants are: $\kappa_n = \left.\frac{d^n K_x}{dt^n}\right|_{t=0}$.
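As a worked example of these definitions (a standard calculation, spelled out here for reference), completing the square in the Gaussian integral gives the MGF and CGF of the normal distribution:

$$M_x(t) = \int_{-\infty}^{\infty} \frac{dx}{\sqrt{2\pi\sigma^2}}\; e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, e^{tx} = e^{\mu t + \sigma^2 t^2/2}, \qquad K_x(t) = \ln M_x(t) = \mu t + \tfrac{1}{2}\sigma^2 t^2.$$

Differentiating $K_x(t)$ at $t = 0$ gives $\kappa_1 = \mu$, $\kappa_2 = \sigma^2$, and $\kappa_n = 0$ for $n \ge 3$, which is the sense in which higher cumulants measure deviations from Gaussianity.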
- Frequencies and probabilities: Law of large numbers. If $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$ with the $x_i$ i.i.d. (mean $\mu$, variance $\sigma^2$), then $\langle \bar{x} \rangle = \mu$ and $\sigma^2_{\bar{x}} = \sigma^2/n \to 0$. Thus the sample mean approaches the true mean of the distribution. See one of the homework problems for week 2.
- Central limit theorem: a sum of many i.i.d. random variables with finite variance, suitably shifted and scaled, approaches a Gaussian distribution. See one of the homework problems for week 2.
- Random walk and diffusion:
- Unbiased random walk in 1-d: $n$ steps of length $\pm\ell$ each (equal probability), one step every $\tau$. For the total displacement, $\langle x \rangle = 0$ and $\langle x^2 \rangle = n\ell^2$
- Conventionally, for a diffusive process: $\langle x \rangle = 0$ and $\langle x^2 \rangle = 2dDt$, where $d$ is the dimension. So, the random walk is an example of a diffusive process on long time scales, and for this random walk: $d = 1$ and $D = \ell^2/(2\tau)$, with $t = n\tau$.
- A biased walk (step $+\ell$ with probability $p \ne 1/2$) gets a drift, $\langle x \rangle = n\ell(2p - 1) \ne 0$, on top of the diffusive spread.
- multivariate random walk: $\langle \mathbf{x} \rangle = 0$ and $\langle \mathbf{x}^2 \rangle = 2dDt$, where $d$ is the dimension, and $\langle x_i x_j \rangle = 2Dt\,\delta_{ij}$. We derive this by noting that diffusion/random walk in every dimension is independent of the other dimensions. (A simulation sketch follows.)
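A simulation sketch of the unbiased 1-d walk (step length, step time, and ensemble size are our own choices):

```python
# Unbiased 1-d random walk: n steps of +-ell, one per tau, so D = ell^2 / (2 tau).
import random

ell, tau = 1.0, 1.0
n_steps, n_walkers = 1000, 10_000

msd = 0.0                                   # mean squared displacement <x^2>
for _ in range(n_walkers):
    x = sum(random.choice((-ell, ell)) for _ in range(n_steps))
    msd += x * x
msd /= n_walkers

D = ell**2 / (2 * tau)                      # predicted diffusion coefficient
t = n_steps * tau
print(msd, 2 * D * t)                       # both close to n_steps * ell^2
```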
Homework (due Sep 10)
- (Problem 1.2.1 in Grinstead and Snell). Let $\Omega = \{a, b, c\}$ be a sample space. Let $m(a) = 1/2$, $m(b) = 1/3$, and $m(c) = 1/6$. Find the probabilities for all eight subsets of $\Omega$.
- Exponential, $p(t) = r e^{-rt}$; Poisson, $P(n) = \frac{\lambda^n}{n!} e^{-\lambda}$; and multivariate Gaussian, $p(\mathbf{x}) = \frac{1}{\sqrt{(2\pi)^d \det C}} \exp\left[-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T C^{-1} (\mathbf{x}-\boldsymbol{\mu})\right]$ (where $d$ is the dimensionality of $\mathbf{x}$), probability distributions are some of the most important distributions that we will see in this class. Calculate the means and the variances for these distributions. Note that for the Gaussian distribution, the easiest way to calculate the mean and the variance is to calculate the moment generating function first and then differentiate it. Undergraduates: work with 1-dimensional Gaussians, where $p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}$. Graduate students: calculate the covariance for the multivariate normal distribution. Pay attention to how we do integrals over Gaussians -- we will use this over and over in this class. Also note that logarithms of moment generating functions are called cumulant generating functions, and they are often easier to work with. We will denote them as $K_x(t) = \ln M_x(t)$. Note that $\kappa_n = \left. d^n K_x/dt^n \right|_{t=0}$.
- An E. coli moving on a 2-dimensional surface is being tracked in an experiment. It chooses a direction at random and runs, then tumbles and reorients randomly, runs for the second time, tumbles yet again, and keeps running. What is the probability that the three directions it chooses all fall within a given angle $\theta$ of each other? That is, what is the probability that the bacterium moves in roughly the same direction all three times? For graduate students: can you generalize this to $n$ tumbles, instead of three?
- In class we discussed an approximation for the motion of E. coli, where the bacterium would tumble every $\tau$ seconds, moving with the velocity of $\pm v$ between the tumbles. We have concluded that the long-term displacement of the bacterium can be well characterized by diffusion: mean displacement is zero, and $\langle x^2 \rangle \propto t$. (A simulation sketch for checking your answers follows this problem.)
- Calculate the coefficient of proportionality for this relation for the bacterium in one dimension. By convention, for a diffusion in $d$ dimensions, we write: $\langle x^2 \rangle = 2dDt$, where $D$ is the diffusion coefficient. What is the diffusion coefficient for this model?
- Let's now improve the model and say that E. coli tumbles at random times, and the distribution of intervals between two successive tumbles is the exponential distribution with the mean $\tau$.
- Derive the distribution of the number of times the E. coli will tumble over a time $t$.
- Remember that means and variances of independent random variables add, and use this fact repeatedly to calculate the mean and the variance of the displacement of E. coli in this model (still in 1 dimension). Is it still described well by a diffusion model? What is the diffusion coefficient?
- For Grads: If we complicate the model even further, and say that the velocity for each run is sampled independently from a distribution $p(v)$, does this change the diffusive behavior?
- What should we do to the distributions of run durations (and velocities) to violate the diffusive limit?
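For checking the derivations in this problem numerically, here is a simulation sketch of the exponential run-and-tumble model in 1-d (all parameter values are assumptions for illustration):

```python
# Run-and-tumble in 1-d: exponential run durations (mean tau), speed v,
# a random +-v direction for each run. Parameters are illustrative only.
import random

tau, v, t_total = 1.0, 2.0, 200.0
n_walkers = 5000

disps = []
for _ in range(n_walkers):
    t, x = 0.0, 0.0
    while t < t_total:
        dt = min(random.expovariate(1.0 / tau), t_total - t)  # run, clipped at t_total
        x += random.choice((-v, v)) * dt
        t += dt
    disps.append(x)

mean = sum(disps) / n_walkers
msd = sum(x * x for x in disps) / n_walkers
print(mean)             # should be near 0
print(msd / t_total)    # compare to 2D from your analytic diffusion coefficient
```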
- The law of large numbers states that when a random variable is independently sampled from a distribution many times, its sample mean approaches the mean of the distribution. We almost showed this in class, but stopped a bit short. Let's finish the work. Recall that, when independent random variables are summed, means add and variances add (if both exist). Use this to show that the mean of a sample of $n$ independent, identically distributed (denoted: i.i.d.) variables (with mean $\mu$ and variance $\sigma^2$), namely $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$, has the mean equal to $\mu$, and the variance equal to $\sigma^2/n$. Therefore, as $n$ grows, $\bar{x}$ becomes closer and closer to $\mu$, proving the law.
- The most remarkable law in probability theory is the Central Limit Theorem (CLT). Its colloquial formulation is as follows: a sum of many i.i.d. random variables is almost normally distributed. This is supposed to explain why experimental noises are often normally distributed as well. More precisely, suppose $x_1, \ldots, x_n$ are i.i.d. random variables with mean $\mu$ and variance $\sigma^2$. Then the CLT says that $z_n = \frac{\sum_{i=1}^{n} x_i - n\mu}{\sigma\sqrt{n}}$ is distributed according to $\mathcal{N}(0,1)$ (called the standard normal distribution), provided $n$ is sufficiently large.
- Using either Matlab, Excel, or any other package, generate a sequence of $n$ random variables uniformly distributed between 0 and 1. Calculate $z_n$ for them. Do this 100 times and histogram the resulting 100 values of $z_n$. Does the histogram look as if it's coming from a standard normal? (One possible implementation is sketched below.)
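One possible implementation in Python (the choice of $n$ and the crude histogram check are our own):

```python
# CLT experiment: z_n for sums of n uniform variates, repeated 100 times.
import math
import random

n, reps = 1000, 100               # n is an assumed choice
mu = 0.5                          # mean of Uniform(0,1)
sigma = math.sqrt(1.0 / 12.0)     # std of Uniform(0,1)

zs = []
for _ in range(reps):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (sigma * math.sqrt(n)))

# A standard normal puts about 68% of its mass in [-1, 1].
frac = sum(1 for z in zs if -1.0 <= z <= 1.0) / reps
print(frac)     # should be roughly 0.68
```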
- For graduate students. Let's prove the CLT.
- First show that if $z = x + y$ (where all three are random variables, with $x$ and $y$ independent), then $M_z(t) = M_x(t)\,M_y(t)$, or, alternatively, $K_z(t) = K_x(t) + K_y(t)$. In particular, this means that, for $z_n = \frac{\sum_{i=1}^{n} x_i - n\mu}{\sigma\sqrt{n}}$, we have $M_{z_n}(t) = \left[M_{(x-\mu)/(\sigma\sqrt{n})}(t)\right]^n$.
- Write $M_{(x-\mu)/(\sigma\sqrt{n})}(t)$ to the first few orders in the Taylor series in $t$. Use the identity $\lim_{n\to\infty}\left(1 + \frac{a}{n}\right)^n = e^a$ to show that $M_{z_n}(t)$ approaches the moment generating function for a standard normal as $n \to \infty$.
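A sketch of where these steps lead (standard manipulations, filled in here for reference): writing $y = (x - \mu)/(\sigma\sqrt{n})$, so that $\langle y \rangle = 0$ and $\langle y^2 \rangle = 1/n$,

$$M_y(t) = 1 + t\langle y \rangle + \frac{t^2}{2}\langle y^2 \rangle + \ldots = 1 + \frac{t^2}{2n} + O(n^{-3/2}),$$

$$M_{z_n}(t) = \left[M_y(t)\right]^n = \left(1 + \frac{t^2}{2n} + \ldots\right)^n \to e^{t^2/2} \text{ as } n \to \infty,$$

which is exactly the MGF of the standard normal $\mathcal{N}(0,1)$.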