Physics 380, 2011: Lecture 18

From Ilya Nemenman: Theoretical Biophysics @ Emory

Back to the main Teaching page.

Back to Physics 380, 2011: Information Processing in Biology. We are proceeding with the dynamical information processing block. In the next couple of lectures, we will follow the article by Detwiler et al., 2000.

Warm up question

How do we calculate the amount of information processed in the vertebrate photoreceptor?

Main lecture

  1. We continue to follow the Detwiler et al., 2000, article in this lecture.
  2. Let δx(t) and x̄ denote the noise and the signal in a quantity x(t), respectively. Then x(t) = x̄ + δx(t). For the simple kinetic scheme in which x is produced by an enzyme E at a rate k and degraded with a time constant τ, dx/dt = kE − x/τ, this can be rewritten (linearizing around the steady state x̄ = kĒτ) as d(δx)/dt = −δx/τ + k δE.
  3. With noise, this gives: d(δx)/dt = −δx/τ + k δE + ξ(t). Here we use ⟨ξ(t)ξ(t′)⟩ = (kĒ + x̄/τ) δ(t − t′) = (2x̄/τ) δ(t − t′), that is, white noise whose strength is the sum of the production and the degradation rates. In reality, noises in complex reactions are not as simple. See Sinitsyn and Nemenman, 2007.
  4. The noise ξ clearly contributes to the variance of δx, but so does the noise in the enzyme, δE. Overall: δx is driven by two independent random terms, k δE(t) and ξ(t), and both must be kept in what follows.
  5. What is ξ(t)? It's a random variable. Hence δx(t) is also a random variable. We need to calculate its mean and variance.
  6. Let's calculate the mean and the variance of each Fourier component of δx. The mean is zero, ⟨δx̃(ω)⟩ = 0, since ⟨δE⟩ = ⟨ξ⟩ = 0.
  7. To calculate variances, we will need the Fourier transforms of the functions: δx̃(ω) = ∫ dt e^{iωt} δx(t). Fourier transforming the Langevin equation replaces d/dt with −iω, and hence δx̃(ω) = [k δẼ(ω) + ξ̃(ω)] / (1/τ − iω).
  8. Variance is the mean square of the distance of the random variable from its mean (zero in this case). Recalling that the Fourier components are now complex, we need ⟨|δx̃(ω)|²⟩ for the variance. This is given by the Wiener-Khinchin theorem as ⟨δx̃(ω) δx̃*(ω′)⟩ = 2π δ(ω − ω′) S_x(ω), where S_x(ω) = ∫ dt e^{iωt} ⟨δx(0) δx(t)⟩.
  9. The expression S_X(ω) = ∫ dt e^{iωt} ⟨δX(0) δX(t)⟩ for any quantity X is called the (power) spectrum.
  10. Thus: S_x(ω) = [k² S_E(ω) + 2x̄/τ] / (ω² + 1/τ²).
  11. The signal and the noise get "filtered" by the system: both are multiplied by the Lorentzian factor 1/(ω² + 1/τ²). But notice that all frequency components are independent of each other. So each component will have its own signal and noise.
  12. We have studied information transmission in such Gaussian channels: each frequency component contributes to the information independently. We have I = (T/2) ∫ (dω/2π) log₂[1 + S_signal(ω)/S_noise(ω)] for an observation of duration T, where the spectrum of the signal is given by the filtered enzyme fluctuations, S_signal(ω) = k² S_E(ω)/(ω² + 1/τ²), and that of the noise by S_noise(ω) = (2x̄/τ)/(ω² + 1/τ²), so that their ratio is k² S_E(ω) τ/(2x̄).
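The Langevin description above can be sanity-checked numerically. The following Python sketch (all parameter values, τ = 1 and x̄ = 100, are made up for illustration, and the enzyme fluctuations δE are switched off) integrates d(δx)/dt = −δx/τ + ξ(t) by the Euler-Maruyama method and verifies that the stationary variance equals x̄, as expected for Poisson-like birth-death noise:

```python
import numpy as np

# Euler-Maruyama integration of the Langevin equation
#   d(dx)/dt = -dx/tau + xi(t),  <xi(t) xi(t')> = (2*xbar/tau) delta(t-t'),
# with the enzyme fluctuations delta-E switched off for simplicity.
rng = np.random.default_rng(0)
tau, xbar = 1.0, 100.0        # relaxation time and mean copy number (made up)
dt, n = 0.01, 1_000_000       # time step and number of steps

x = np.empty(n)
x[0] = 0.0
kicks = rng.normal(0.0, np.sqrt(2 * xbar / tau * dt), size=n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] / tau * dt + kicks[i]

# For this Ornstein-Uhlenbeck process the stationary variance is
# (2*xbar/tau) * (tau/2) = xbar, i.e., Poisson-like fluctuations.
var_est = x[n // 2:].var()    # drop the first half as a transient
print(var_est)                # close to xbar = 100
```

Note that the kick variance scales with dt, not dt², because ξ(t) is delta-correlated white noise.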
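The spectrum result can be checked the same way. This sketch (again with made-up parameters, and using scipy's Welch periodogram averaging, which is a numerical tool and not part of the lecture) simulates the same Langevin equation without enzyme noise and compares the estimated low-frequency spectrum to the Lorentzian S_x(ω) = (2x̄/τ)/(ω² + 1/τ²):

```python
import numpy as np
from scipy.signal import welch

# Estimate the power spectrum of a simulated Langevin trajectory and
# compare with the analytic Lorentzian S_x(w) = (2*xbar/tau)/(w^2 + 1/tau^2).
rng = np.random.default_rng(1)
tau, xbar = 1.0, 100.0        # made-up parameters, as before
dt, n = 0.01, 2 ** 20

x = np.empty(n)
x[0] = 0.0
kicks = rng.normal(0.0, np.sqrt(2 * xbar / tau * dt), size=n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] / tau * dt + kicks[i]

# Welch's method averages periodograms over many segments; scipy returns a
# one-sided density versus frequency f in Hz, so the theory is 2*S_x(2*pi*f).
f, pxx = welch(x, fs=1.0 / dt, nperseg=8192)
theory = 2.0 * (2.0 * xbar / tau) / ((2.0 * np.pi * f) ** 2 + 1.0 / tau ** 2)

low = slice(1, 6)             # a few low-frequency bins, skipping f = 0
print(pxx[low].mean(), theory[low].mean())
```

The factor of 2 converts the two-sided spectrum of the lecture into the one-sided convention used by scipy.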
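Finally, the information integral can be evaluated once a signal-to-noise ratio is chosen. The sketch below assumes, purely for illustration, a Lorentzian SNR(ω) = A/(1 + ω²τ_E²), as would arise from exponentially correlated enzyme fluctuations (A and τ_E are made-up values), and compares direct quadrature of the information rate with the closed form available in this special case:

```python
import numpy as np

# Information rate R = int_0^inf (dw/2pi) log2(1 + SNR(w)) for an assumed
# Lorentzian signal-to-noise ratio SNR(w) = A/(1 + (w*tau_E)^2).
A, tau_E = 3.0, 1.0           # made-up illustrative values

dw = 1e-3
w = np.arange(0.0, 500.0, dw)  # truncate the integral at w = 500 rad/s
rate = np.log2(1.0 + A / (1.0 + (w * tau_E) ** 2)).sum() * dw / (2.0 * np.pi)

# For this SNR there is a closed form: with b = sqrt(1 + A),
#   int_0^inf ln((x^2 + b^2)/(x^2 + 1)) dx = pi * (b - 1),
# so R = (sqrt(1 + A) - 1) / (2 * tau_E * ln 2).
exact = (np.sqrt(1.0 + A) - 1.0) / (2.0 * tau_E * np.log(2.0))
print(rate, exact)            # both about 0.72 bits per unit time
```

The rate saturates as A grows (like sqrt(A)), a reminder that boosting the gain k buys ever less information once the channel is noise-filtered.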