We are continuing our review of some basic concepts of probability theory, such as probability distributions, conditionals, marginals, expectations, etc. We will discuss the central limit theorem and will derive some properties of random walks. Finally, we will study some specific useful probability distributions. In the course of this whole lecture block, we should be thinking about E. coli chemotaxis in the background -- all of these concepts will be applicable.
A very good introduction to probability theory can be found in
Introduction to Probability by CM Grinstead and JL Snell.
Warmup question
- What are the mean and the variance of the position of E. coli at the end of a single run, if the bacterium started at the origin? Let's suppose that the motion is 1-dimensional (a simulation sketch follows below).
- Solve the same problem for the 2- and 3-dimensional versions at home.
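Here is a minimal simulation sketch for the one-dimensional warmup, assuming the usual model of a run: the bacterium moves at a constant speed <math>v</math> in a fixed direction for an exponentially distributed time with mean <math>\tau</math>, so that <math>\langle x \rangle = v\tau</math> and <math>\mathrm{var}(x) = v^2\tau^2</math>. The particular values of <math>v</math> and <math>\tau</math> below are illustrative assumptions, not numbers from the lecture.

<syntaxhighlight lang="python">
import numpy as np

# Assumed (illustrative) parameters: run speed and mean run duration.
v = 20.0    # um/s, run speed (assumption)
tau = 1.0   # s, mean run duration (assumption)

rng = np.random.default_rng(0)

# A single run: duration t ~ Exponential(mean = tau), displacement x = v * t.
t = rng.exponential(scale=tau, size=1_000_000)
x = v * t

# Monte Carlo estimates vs. the analytic answers <x> = v*tau, var(x) = v^2 * tau^2.
print("mean:     simulated = %.3f   analytic = %.3f" % (x.mean(), v * tau))
print("variance: simulated = %.3f   analytic = %.3f" % (x.var(), v**2 * tau**2))
</syntaxhighlight>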
Main lecture
- Multivariate distributions
- Conditional and joint probabilities, Bayes' theorem: <math>P(x,y) = P(x|y)P(y) = P(y|x)P(x)</math>, so that <math>P(x|y) = \frac{P(y|x)P(x)}{P(y)}</math>, with the marginal <math>P(y) = \sum_x P(x,y)</math> (a small numerical illustration appears at the end of these notes).
- Independence: two variables are independent if and only if <math>P(x,y) = P(x)P(y)</math>, or, equivalently, <math>P(x|y) = P(x)</math>, or <math>P(y|x) = P(y)</math>.
- Change of variables for continuous and discrete variates
- Grad students: work out the analogous change-of-variables formula for multi-dimensional variables (the Jacobian determinant takes the place of the one-dimensional derivative).
- How do we generate an exponentially distributed random variable? Minus the logarithm of a uniform random number on <math>(0,1]</math> is an exponentially distributed random number (see the code sketch at the end of these notes).
- Addition of variables: if <math>z = x + y</math>, then <math>P_z(z) = \int dx\, dy\, P(x,y)\,\delta(z - x - y) = \int dx\, P(x, z - x)</math>.
- If <math>x</math> and <math>y</math> are independent, that is, <math>P(x,y) = P_x(x)P_y(y)</math>, then we prove similarly that <math>P_z(z) = \int dx\, P_x(x)\,P_y(z - x)</math>: the distribution of the sum is the convolution of the individual distributions.
- Moment generating functional
- Moment generating function (MGF): <math>M_x(t) \equiv \langle e^{tx}\rangle = \int dx\, P(x)\, e^{tx}</math>. The utility of the MGF comes from the following result: <math>\langle x^n\rangle = \left.\frac{d^n M_x(t)}{dt^n}\right|_{t=0}</math>. That is, the moments are the Taylor coefficients of the MGF: <math>M_x(t) = \sum_{n=0}^{\infty}\frac{t^n}{n!}\langle x^n\rangle</math>.
- Properties of the MGF:
- If <math>z = x + y</math> with <math>x</math> and <math>y</math> independent, then <math>M_z(t) = M_x(t)\,M_y(t)</math>: MGFs of independent variables multiply (a numerical check appears at the end of these notes).
- Example: we explicitly calculate the MGF for the Poisson distribution <math>P(n) = \lambda^n e^{-\lambda}/n!</math> and find <math>M(t) = e^{\lambda(e^t - 1)}</math> (the calculation is written out at the end of these notes).
- Cumulant generating function (CGF): <math>K_x(t) \equiv \ln M_x(t)</math>. Then the cumulants are <math>\kappa_n = \left.\frac{d^n K_x(t)}{dt^n}\right|_{t=0}</math>. That is, <math>K_x(t) = \sum_{n=1}^{\infty}\frac{t^n}{n!}\kappa_n</math>. Cumulants are an alternative set of numbers, closely related to the moments, that characterize the distribution.
- We can show how the first few cumulants are related to the moments: <math>\kappa_1 = \langle x\rangle</math> is the mean, and <math>\kappa_2 = \langle x^2\rangle - \langle x\rangle^2 = \sigma_x^2</math> is the variance.
- Similarly we show that <math>\kappa_3 = \langle\left(x - \langle x\rangle\right)^3\rangle</math>, the third central moment.
- CGFs of independent variables add: if <math>z = x + y</math> with <math>x</math> and <math>y</math> independent, then <math>K_z(t) = K_x(t) + K_y(t)</math>. In particular, the means and the variances of independent variables add.
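Sketches and worked examples

A small numerical illustration of joint, marginal, and conditional probabilities and of Bayes' theorem, referenced above. The 2x2 joint table below is made up purely for illustration; this is a minimal sketch in Python, not part of the original lecture.

<syntaxhighlight lang="python">
import numpy as np

# A made-up joint distribution P(x, y) over two binary variables (rows: x, columns: y).
P_xy = np.array([[0.10, 0.30],
                 [0.20, 0.40]])
assert np.isclose(P_xy.sum(), 1.0)

# Marginals: P(x) = sum over y of P(x, y); P(y) = sum over x of P(x, y).
P_x = P_xy.sum(axis=1)
P_y = P_xy.sum(axis=0)

# Conditionals: P(x|y) = P(x, y) / P(y); P(y|x) = P(x, y) / P(x).
P_x_given_y = P_xy / P_y[np.newaxis, :]
P_y_given_x = P_xy / P_x[:, np.newaxis]

# Bayes' theorem: P(x|y) = P(y|x) P(x) / P(y).
bayes = P_y_given_x * P_x[:, np.newaxis] / P_y[np.newaxis, :]
print(np.allclose(P_x_given_y, bayes))        # True

# Independence: x and y are independent iff P(x, y) = P(x) P(y) for all x, y.
print(np.allclose(P_xy, np.outer(P_x, P_y)))  # False for this particular table
</syntaxhighlight>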
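The statement above that minus the log of a uniform random number is exponentially distributed is a direct application of the change of variables formula (equivalently, of inverting the cumulative distribution function). Here is a minimal numerical check; the rate and sample size are arbitrary choices for illustration.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0  # assumed rate of the target exponential distribution (illustrative)

# If u ~ Uniform[0, 1), then x = -ln(1 - u) / lam ~ Exponential(lam);
# using (1 - u) keeps the argument of the log strictly positive.
u = rng.uniform(size=1_000_000)
x = -np.log(1.0 - u) / lam

# For Exponential(lam): mean = 1/lam, variance = 1/lam^2.
print("mean:     simulated = %.4f   expected = %.4f" % (x.mean(), 1 / lam))
print("variance: simulated = %.4f   expected = %.4f" % (x.var(), 1 / lam**2))
</syntaxhighlight>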
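A quick numerical check of the statements above about sums of independent variables: the MGF of the sum is (approximately, for finite samples) the product of the individual MGFs, and the first two cumulants (mean and variance) add. The distributions and parameters below are arbitrary choices for illustration.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000

# Two independent variables with arbitrary (illustrative) distributions.
x = rng.exponential(scale=1.5, size=N)
y = rng.normal(loc=0.5, scale=0.8, size=N)
z = x + y

# Empirical MGF M(t) = <exp(t * x)>, estimated from samples.
def mgf(samples, t):
    return np.exp(t * samples).mean()

t = 0.3
print("M_z(t) = %.3f   M_x(t) * M_y(t) = %.3f" % (mgf(z, t), mgf(x, t) * mgf(y, t)))

# Cumulants of independent variables add: check the mean and the variance.
print("mean:     %.3f vs %.3f" % (z.mean(), x.mean() + y.mean()))
print("variance: %.3f vs %.3f" % (z.var(), x.var() + y.var()))
</syntaxhighlight>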
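Finally, the Poisson example mentioned above, written out. With <math>P(n) = \lambda^n e^{-\lambda}/n!</math>,

<math>
M(t) = \sum_{n=0}^{\infty} e^{tn}\,\frac{\lambda^n e^{-\lambda}}{n!}
     = e^{-\lambda}\sum_{n=0}^{\infty}\frac{\left(\lambda e^t\right)^n}{n!}
     = e^{\lambda\left(e^t - 1\right)},
\qquad
K(t) = \ln M(t) = \lambda\left(e^t - 1\right).
</math>

Expanding <math>e^t - 1 = t + t^2/2! + t^3/3! + \cdots</math> shows that every cumulant of the Poisson distribution equals <math>\lambda</math>; in particular, its mean and its variance are both equal to <math>\lambda</math>.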