Physics 434, 2012: Lecture 10
Back to the main Teaching page.
Back to Physics 434, 2012: Information Processing in Biology.
We are wrapping up all the loose ends for the probability/random walks section of the class. We will return to some of the related questions in later sections, of course. During this lecture, we also started the new block on information theory, Physics 434, 2012: Lectures 10-11.
Main Lecture
- Langevin equation. If a chemical species is produced in one reaction with rate <math>r_+</math> and degraded in another with rate <math>r_-</math>, and all of the reaction events are independent, then the mean number of particles produced in a time <math>\Delta t</math> is <math>r_+\Delta t</math> and the variance is also <math>r_+\Delta t</math> (Poisson statistics; similarly for degradation). If the number of production and degradation events is large, <math>r_\pm\Delta t\gg 1</math>, then these terms can be approximated as Gaussians. We can, therefore, write <math>\Delta n=(r_+-r_-)\Delta t+\sqrt{(r_++r_-)\Delta t}\,\xi</math>, where <math>\xi\sim\mathcal{N}(0,1)</math> (see the first numerical sketch below this list).
- Notice that the noise term scales as <math>\sqrt{\Delta t}</math>, while the deterministic term scales as <math>\Delta t</math>, so the noise is larger than the deterministic term for small <math>\Delta t</math>.
- We can transform this into a (stochastic) differential equation by taking the limit <math>\Delta t\to 0</math>: <math>\frac{dn}{dt}=r_+-r_-+\sqrt{r_++r_-}\,\eta(t)</math>, where <math>\langle\eta(t)\rangle=0</math>, and <math>\langle\eta(t)\eta(t')\rangle=\delta(t-t')</math>. Such <math>\eta</math> is called a Wiener process, after Norbert Wiener, or white noise -- we will understand why later (a numerical integration of this equation is sketched in the second example below).
- As the last item in this block of the class, we talked about gradient sensing in another organism, D. discoideum. It is a much larger bug, and so many of the constraints that E. coli had do not apply. We introduced the Local Excitation -- Global Inhibition (LEGI) model (Levchenko and Iglesias, 2002), which allows an organism to make a spatial, rather than a temporal, comparison of chemical concentrations. The question then is: does what we have discussed in this "probability" block of the class still apply? We discussed that the comparison is made locally, by a small volume (about the size of a molecular complex), and hence the arrival of molecules there is very stochastic. Even for a large organism, the noise may be quite important! (A simplified caricature of the LEGI scheme is sketched in the last example below.)
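To make the Gaussian approximation above concrete, here is a minimal numerical sketch in Python (the rates <code>r_plus</code>, <code>r_minus</code> and the window <code>dt</code> are made-up illustration values, not from the lecture). It compares the exact Poisson statistics of the production and degradation counts in a window <math>\Delta t</math> with the Gaussian Langevin step, whose mean and variance should be <math>(r_+-r_-)\Delta t</math> and <math>(r_++r_-)\Delta t</math>.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative (assumed) rates: production r_plus and degradation r_minus,
# in events per unit time, held constant over the window dt.
r_plus, r_minus = 100.0, 60.0
dt = 1.0
n_samples = 100_000

rng = np.random.default_rng(0)

# Exact picture: independent Poisson numbers of production and degradation events.
births = rng.poisson(r_plus * dt, n_samples)
deaths = rng.poisson(r_minus * dt, n_samples)
dn_exact = births - deaths

# Gaussian (Langevin) approximation: dn = (r_+ - r_-) dt + sqrt((r_+ + r_-) dt) * xi.
xi = rng.standard_normal(n_samples)
dn_gauss = (r_plus - r_minus) * dt + np.sqrt((r_plus + r_minus) * dt) * xi

print("exact:     mean %.2f, var %.2f" % (dn_exact.mean(), dn_exact.var()))
print("gaussian:  mean %.2f, var %.2f" % (dn_gauss.mean(), dn_gauss.var()))
print("predicted: mean %.2f, var %.2f" % ((r_plus - r_minus) * dt, (r_plus + r_minus) * dt))
</syntaxhighlight>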
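The second sketch integrates the Langevin equation with the Euler-Maruyama method, assuming (for illustration only) constant production <math>k_+</math> and first-order degradation <math>k_-n</math>, i.e. <math>r_+=k_+</math> and <math>r_-=k_-n</math>; the rate values are arbitrary. Note how the noise enters each step with <math>\sqrt{\Delta t}</math> while the drift enters with <math>\Delta t</math>, as discussed above.

<syntaxhighlight lang="python">
import numpy as np

# Assumed illustrative rates: constant production k_plus, first-order degradation k_minus * n.
k_plus, k_minus = 50.0, 1.0
dt = 1e-3             # time step: drift enters with dt, noise with sqrt(dt)
T = 200.0
steps = int(T / dt)

rng = np.random.default_rng(1)
n = k_plus / k_minus  # start at the deterministic steady state
trace = np.empty(steps)

for i in range(steps):
    drift = (k_plus - k_minus * n) * dt
    noise = np.sqrt((k_plus + k_minus * n) * dt) * rng.standard_normal()
    n += drift + noise
    trace[i] = n

# For this birth-death process the stationary distribution is Poisson-like,
# so both the mean and the variance of n should be close to k_plus / k_minus.
print("mean %.1f, variance %.1f, expected %.1f" %
      (trace.mean(), trace.var(), k_plus / k_minus))
</syntaxhighlight>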
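Finally, a deliberately simplified two-compartment caricature of the LEGI idea (this is not the published Levchenko-Iglesias model; all rate constants and the ratio readout are assumptions for illustration): the local excitation in each compartment tracks the local signal, a single global inhibitor tracks the spatially averaged signal, and the response is read out as their ratio.

<syntaxhighlight lang="python">
import numpy as np

# Two compartments: "front" and "back" of the cell, exposed to a shallow gradient.
S = np.array([1.2, 0.8])       # assumed local chemoattractant levels
k_e, k_minus_e = 1.0, 1.0      # excitation production / decay (fast, local)
k_i, k_minus_i = 1.0, 0.1      # inhibition production / decay (slow, global)

dt, T = 1e-2, 100.0
E = np.zeros(2)                 # local excitation in each compartment
I = 0.0                         # global inhibition, shared by the whole cell

for _ in range(int(T / dt)):
    E += (k_e * S - k_minus_e * E) * dt          # driven by the local signal
    I += (k_i * S.mean() - k_minus_i * I) * dt   # driven by the average signal

# Read out the response as excitation relative to inhibition: this compares the
# local concentration to the global average, i.e., a spatial comparison.
R = E / I
print("front response %.3f, back response %.3f" % (R[0], R[1]))
</syntaxhighlight>

In a shallow gradient the front-to-back response ratio tracks the concentration ratio, while a uniform stimulus gives identical responses in both compartments -- the spatial comparison described above. Of course, this deterministic sketch leaves out the molecular noise that makes such local measurements stochastic.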