Physics 380, 2011: Lecture 4
We are continuing our review of some basic concepts of probability theory, such as probability distributions, conditionals, marginals, expectations, etc. We will discuss the central limit theorem and will derive some properties of random walks. Finally, we will study some specific useful probability distributions. In the course of this whole lecture block, we should be thinking about E. coli chemotaxis in the background -- all of these concepts will be applicable.
A very good introduction to probability theory can be found in Introduction to Probability by CM Grinstead and JL Snell.
Warm-up questions
- Consider a neuron. Action potentials are generated by fluxes of ions through the channels in the neural membrane (read Dayan and Abbott, 2005). The channels open and close independently, with an exponentially distributed time in each state, and in the closed state they don't let ions pass through. Which is the better strategy to ensure that the neuron's voltage is nearly deterministic: one big channel, or many small ones? (A small numerical illustration appears after these questions.)
- Now consider an idealized spherical cell of radius $a$ whose entire surface is covered with disk-like receptors of radius $s$. This is a reasonably good model for an immune cell, such as a mast cell. There are $N$ such receptors. Using the Berg-Purcell limit from the first lecture, we know that the accuracy of determination of the concentration $c$ by a single receptor is $\delta c/c \sim 1/\sqrt{D s c T}$, where $D$ is the diffusion coefficient and $T$ is the observation time. Since we have $N$ receptors, we use the law of large numbers to calculate that the overall accuracy of the concentration determination by the cell should be $\delta c/c \sim 1/\sqrt{N D s c T}$. On the other hand, if we consider the entire cell as a single large receptor of size $a$, the Berg-Purcell limit gives $\delta c/c \sim 1/\sqrt{D a c T}$. Can you reconcile the differences between these two estimates?
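A minimal Matlab sketch related to the first question (the open probability, the numbers of channels, and the number of snapshots are arbitrary illustrative choices): splitting a fixed total conductance among $N$ independent two-state channels makes the relative fluctuations of the total current shrink as $1/\sqrt{N}$.

```matlab
% Relative fluctuations of the total channel current vs the number of channels.
% Each channel is open with probability p at equilibrium; the total conductance
% is fixed, so each of the N channels carries a current 1/N (arbitrary units) when open.
p     = 0.3;              % equilibrium open probability (made-up value)
nsamp = 1e4;              % number of independent snapshots
Nlist = [1 10 100 1000];  % numbers of channels to compare

for N = Nlist
    nopen = sum(rand(N, nsamp) < p, 1);   % open channels in each snapshot
    I     = nopen / N;                    % total current at fixed total conductance
    fprintf('N = %4d:  relative std of current = %.3f (expect %.3f)\n', ...
        N, std(I)/mean(I), sqrt((1-p)/(p*N)));
end
```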
Main Lecture
- We are still answering the question: what will the distribution of E. coli positions be if it starts at $x = 0$ and moves for time $t$?
- Central limit theorem: a sum of many i.i.d. random variables (with finite variances) approaches a certain distribution, which we call a Gaussian distribution. This is one of the most remarkable laws in probability theory. It explains why experimental noise is often Gaussian distributed as well. More precisely, suppose $x_1, \dots, x_N$ are i.i.d. random variables with mean $\mu$ and variance $\sigma^2$. Then the CLT says that $z = \left(\sum_{i=1}^{N} x_i - N\mu\right)/(\sigma\sqrt{N})$ is distributed according to $\mathcal{N}(0,1)$ (called the standard normal distribution), provided $N$ is sufficiently large. We prove this assuming that none of the cumulants of the i.i.d. variables is infinite.
- The same holds if the variables have different variances and means, but all variances are bounded. Convergence will be slower though.
- The central limit distribution has only the first two cumulants nonzero. What is this distribution? It is a Gaussian with the given mean and variance. We show this (see the derivation below).
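In more detail: if only the first two cumulants, $\mu$ and $\sigma^2$, are nonzero, then the characteristic function and its inverse Fourier transform are

$$\phi(k) = \langle e^{ikx}\rangle = \exp\left(i\mu k - \tfrac{1}{2}\sigma^2 k^2\right), \qquad p(x) = \int \frac{dk}{2\pi}\, e^{-ikx}\,\phi(k) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right].$$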
- Numerical simulation of the CLT for exponential and binary distributions: CLT.m
- Generation of exponential random numbers: minus the logarithm of a uniform random number on $(0,1)$ is an exponentially distributed random number.
- E. coli motion has a Gaussian distribution of end points -- it's a diffusive motion as well, just like the diffusion of small molecules. We demonstrate this by numerical simulations (a minimal sketch in the spirit of CLT.m follows below).
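The CLT.m used in class is linked above; the following is only a minimal sketch of the same idea (the number of terms, the number of sums, and the exponential rate are arbitrary choices): exponential random numbers are generated as minus the log of uniform ones, summed, standardized, and their histogram is compared to the standard normal density.

```matlab
% CLT demo (sketch): standardized sums of exponential random variables are nearly Gaussian.
N      = 50;     % number of i.i.d. terms in each sum
nsums  = 1e5;    % number of independent sums to histogram
lambda = 2;      % rate of the exponential distribution (arbitrary choice)

x = -log(rand(N, nsums)) / lambda;        % exponential variates via the inverse CDF
s = sum(x, 1);                            % the sums
z = (s - N/lambda) ./ (sqrt(N)/lambda);   % standardize: subtract the mean, divide by the std

% Compare the histogram of z with the standard normal density.
centers = -5:0.2:5;
counts  = hist(z, centers);
bar(centers, counts / (nsums * 0.2), 1); hold on;   % bin width is 0.2
plot(centers, exp(-centers.^2/2)/sqrt(2*pi), 'r', 'LineWidth', 2); hold off;
xlabel('standardized sum z'); ylabel('probability density');
```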
- Distributions:
- normal: $P(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]$; describes diffusive motion
- $\delta$-distribution: the deterministic limit $\sigma \to 0$; $P(x) = \delta(x-\mu)$.
- multivariate normal: $P(\vec{x}) = \frac{1}{\sqrt{(2\pi)^d \det C}}\,\exp\left[-\frac{1}{2}(\vec{x}-\vec{\mu})^T C^{-1}(\vec{x}-\vec{\mu})\right]$, where $C$ is the covariance matrix, $C_{ij} = \langle (x_i-\mu_i)(x_j-\mu_j)\rangle$
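As an aside, here is a minimal Matlab sketch of sampling from a multivariate normal with a given covariance (the dimension and the particular $C$ are arbitrary examples): with the Cholesky factorization $C = LL^T$, the vector $\vec{x} = \vec{\mu} + L\vec{z}$, where $\vec{z}$ is standard normal, has covariance $C$.

```matlab
% Sampling from a multivariate normal N(mu, C) via the Cholesky factorization.
mu = [1; -2];             % mean vector (arbitrary example)
C  = [2 0.8; 0.8 1];      % covariance matrix (must be positive definite)
n  = 1e5;                 % number of samples

L = chol(C, 'lower');               % C = L * L'
x = repmat(mu, 1, n) + L * randn(2, n);   % each column is one sample from N(mu, C)

disp(cov(x'));            % the sample covariance should be close to C
```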
- Random walk and diffusion:
- The CLT explains why some of the details of the E. coli motion that we glossed over are not that important -- the long-term behavior of the motion is largely independent of all cumulants but the first and the second.
- Unbiased random walk in 1-d: $N$ steps of length $\ell$ each. For the total displacement, $\langle x \rangle = 0$ and $\langle x^2 \rangle = N\ell^2$.
- Conventionally, for a diffusive process: $\langle \vec{x} \rangle = \mathrm{const}$ and $\langle (\vec{x} - \langle\vec{x}\rangle)^2 \rangle = 2dDt$, where $d$ is the dimension. So, the random walk is an example of a diffusive process on long time scales, and for this random walk: $\langle x \rangle = 0$ and $D = \ell^2/(2\Delta t)$, with $\Delta t$ the duration of a single step (a numerical check is sketched after this list).
- A biased walk gets $\langle x \rangle \neq 0$, growing linearly in time (a drift).
- multivariate random walk: $\langle \vec{x} \rangle = 0$ and $\langle \vec{x}^{\,2} \rangle = 2dDt$, where $d$ is the dimension, and $\langle x_i x_j \rangle = 2Dt\,\delta_{ij}$. We derive this by noting that the diffusion/random walk in every dimension is independent of the other dimensions.
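A minimal numerical check of these scalings for the unbiased 1-d walk (the step length, the number of steps, and the number of walkers are arbitrary choices):

```matlab
% Unbiased 1-d random walk: <x> stays near 0 and <x^2> grows as N*ell^2.
ell    = 1;      % step length
Nsteps = 1000;   % number of steps per walker
nwalk  = 2000;   % number of independent walkers

steps = ell * sign(rand(Nsteps, nwalk) - 0.5);  % +ell or -ell with equal probability
x     = cumsum(steps, 1);                       % positions after each step

n = (1:Nsteps)';
plot(n, mean(x, 2), n, mean(x.^2, 2), n, n * ell^2, '--');
legend('mean x', 'mean x^2', 'N ell^2', 'Location', 'NorthWest');
xlabel('number of steps N');
```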
- E. coli chemotaxis as a biased random walk: going up the gradient of an attractant, time to a tumble increases. This is described very well in (Berg 2000, Berg and Brown 1972).
- If, going up the gradient, the run time increases from $\tau$ to $\tau + \Delta\tau$, then the expected displacement over a single run has a nonzero component in the direction parallel to the gradient (of order $v\,\Delta\tau$, with $v$ the run speed), and it is zero perpendicular to the gradient. Adding many such runs, we get a biased random walk: E. coli moves preferentially to better areas.
- Does E. coli actually find the greener pastures with this protocol? Look at two nearby points $x_1$ and $x_2$, closer than the length of a single typical run, with the concentration at $x_2$ higher than at $x_1$. Then the flux of bacteria from $x_1$ to $x_2$ is proportional to $P(x_1)/\tau(c_1)$, where $\tau(c)$ is the mean waiting time to a tumble at a concentration $c$. Similarly, the flux from $x_2$ to $x_1$ is proportional to $P(x_2)/\tau(c_2)$. In steady state the two fluxes are equal: $P(x_1)/\tau(c_1) = P(x_2)/\tau(c_2)$. Therefore, $P(x_2)/P(x_1) = \tau(c_2)/\tau(c_1) > 1$, so that $P$ is higher in the direction where $c$ increases. We can now compare all points in a chain and obtain a similar expression for all of them. Note: E. coli doesn't actually decrease its run time when going down the gradient. Note: this is an example of a detailed balance calculation.
- Simulations of E. coli trajectories and intro to Matlab. See Matlab simulation code.
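The course's simulation code is linked above; the following is only a minimal run-and-tumble sketch with made-up parameters: runs directed up a linear attractant gradient (along $+x$) last longer on average than runs directed down it, which produces a net drift up the gradient.

```matlab
% Run-and-tumble chemotaxis sketch (illustrative parameters, not the course code).
v      = 20;     % run speed, um/s
tau_up = 1.5;    % mean run duration when heading up the gradient, s
tau_dn = 0.8;    % mean run duration when heading down the gradient, s
T      = 600;    % total simulated time per bacterium, s
nbact  = 200;    % number of independent bacteria

xfinal = zeros(nbact, 1);
for b = 1:nbact
    pos = [0; 0];  t = 0;
    while t < T
        theta = 2*pi*rand;                                  % new random direction after a tumble
        u     = [cos(theta); sin(theta)];
        tau   = tau_dn + (u(1) > 0) * (tau_up - tau_dn);    % longer runs when heading up-gradient
        trun  = -tau * log(rand);                           % exponentially distributed run time
        trun  = min(trun, T - t);                           % do not run past the total time
        pos   = pos + v * trun * u;
        t     = t + trun;
    end
    xfinal(b) = pos(1);
end

fprintf('mean drift up the gradient: %.1f um over %d s\n', mean(xfinal), T);
hist(xfinal, 30); xlabel('final x position, um'); ylabel('count');
```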