We are proceeding with the dynamical information processing block. In the next couple of lectures, we will follow the article by Detwiler et al., 2000.
Warm-up question
How do we calculate the amount of information processed in the vertebrate photoreceptor?
Main lecture
- We follow the Detwiler et al., 2000, article in this lecture as well.
- Let $\delta x$ and $\bar{x}$ denote the noise and the signal in a quantity $x$, respectively. Then $x(t) = \bar{x}(t) + \delta x(t)$. Substituting this into the kinetics of the enzymatic amplifier and keeping only terms linear in the fluctuation, this can be rewritten as a relaxation equation, $\frac{d\,\delta x}{dt} = -\frac{\delta x}{\tau}$, with the relaxation time $\tau$ set by the mean enzyme activities.
- With noise, this gives: $\frac{d\,\delta x}{dt} = -\frac{\delta x}{\tau} + \xi(t)$. Here we use a white Langevin force, $\langle \xi(t)\,\xi(t')\rangle = 2\bar{q}\,\delta(t-t')$, whose magnitude $2\bar{q}$ is set by the mean rate of the individual production and degradation events (Poisson shot noise). In reality, noises in complex reactions are not as simple. See Sinitsyn and Nemenman, 2007.
- The noise $\xi$ clearly contributes to the variance of $\delta x$, but so does the noise in the enzyme, $\delta E$, which enters through the gain $g$ of the amplifier. Overall: $\frac{d\,\delta x}{dt} = -\frac{\delta x}{\tau} + g\,\delta E(t) + \xi(t)$. (A numerical sketch of these dynamics is given after this list.)
- What is $\delta E$? It's a random variable. Hence $\delta x$ is also a random variable. We need to calculate its mean and variance.
- Let's calculate the mean and the variance of each Fourier component. The mean is zero.
- To calculate variances, we will need the Fourier transforms of the functions involved, $\tilde{x}(\omega) = \int dt\, e^{i\omega t}\, \delta x(t)$, and similarly for $\delta E$ and $\xi$. And hence the Langevin equation becomes algebraic, $-i\omega\,\tilde{x} = -\tilde{x}/\tau + g\,\tilde{E} + \tilde{\xi}$, so that $\tilde{x}(\omega) = \frac{g\,\tilde{E}(\omega) + \tilde{\xi}(\omega)}{1/\tau - i\omega}$.
- Variance is the mean square of the distance of the random variable from its mean (zero in this case). Recalling that the Fourier components are now complex, we need $\langle |\tilde{x}(\omega)|^2 \rangle$ for the variance. This is given by the Wiener-Khinchin theorem as $\langle \tilde{x}(\omega)\, \tilde{x}^*(\omega') \rangle = 2\pi\, \delta(\omega - \omega')\, S_x(\omega)$, where $S_x(\omega) = \int dt'\, e^{i\omega t'} \langle \delta x(t)\, \delta x(t+t') \rangle$.
- The expression $S_X(\omega)$ for any quantity $X$ is called the (power) spectrum.
- Thus: $S_x(\omega) = \frac{g^2 S_E(\omega) + S_\xi(\omega)}{\omega^2 + 1/\tau^2}$, with $S_\xi(\omega) = 2\bar{q}$ for the white noise above. (See the spectrum-estimation sketch after this list.)
- The signal and the noise get "filtered" by the system. But notice that all frequency components are independent of each other, so each component has its own signal and noise.
- We have studied information transmission in such systems: each frequency component contributes to the information independently. We have the information rate $I = \frac{1}{2} \int \frac{d\omega}{2\pi}\, \log_2\left[1 + \frac{S_{\rm signal}(\omega)}{S_{\rm noise}(\omega)}\right]$ (in bits per unit time), where the spectrum of the signal is given by $S_{\rm signal}(\omega) = \frac{g^2 S_E(\omega)}{\omega^2 + 1/\tau^2}$, and the noise spectrum is $S_{\rm noise}(\omega) = \frac{S_\xi(\omega)}{\omega^2 + 1/\tau^2}$. (A numerical evaluation of this integral is sketched below.)
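
A minimal numerical sketch of the Langevin dynamics above, assuming the linear filter model with gain $g$, relaxation time $\tau$, and white noise $\xi$. The enzyme fluctuation $\delta E$ is modeled here as an Ornstein-Uhlenbeck process with correlation time tau_E; this choice and all parameter values are illustrative assumptions, not numbers from Detwiler et al., 2000.

```python
import numpy as np

# Illustrative parameters (assumptions for this sketch, not values from Detwiler et al., 2000)
tau = 1.0       # relaxation time of the output fluctuation delta x
g = 2.0         # gain of the amplifier stage
tau_E = 5.0     # correlation time of the enzyme fluctuation delta E (modeled as an OU process)
sigma_E = 1.0   # steady-state standard deviation of delta E
q = 0.5         # white-noise intensity: <xi(t) xi(t')> = 2 q delta(t - t')

dt, T = 1e-3, 500.0          # integration step and total simulated time
n = int(T / dt)
rng = np.random.default_rng(0)

dx = np.zeros(n)             # output fluctuation delta x
dE = np.zeros(n)             # enzyme fluctuation delta E (the signal)

for i in range(1, n):
    # Ornstein-Uhlenbeck update for the enzyme fluctuation (illustrative signal model)
    dE[i] = dE[i-1] - dE[i-1] / tau_E * dt \
            + sigma_E * np.sqrt(2 * dt / tau_E) * rng.standard_normal()
    # Euler-Maruyama update for the output: relaxation + gain * input + white shot-like noise
    dx[i] = dx[i-1] + (-dx[i-1] / tau + g * dE[i-1]) * dt \
            + np.sqrt(2 * q * dt) * rng.standard_normal()

print("std of delta E:", dE.std())   # should be close to sigma_E
print("std of delta x:", dx.std())
```

The step size dt must stay small compared with both tau and tau_E for the discretization to be faithful.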
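
To check the Lorentzian filtering of the noise, the sketch below simulates only the $\xi$-driven part of the dynamics and compares the estimated power spectral density of the trace with $S_\xi(\omega)/(\omega^2 + 1/\tau^2)$. The Welch estimator from SciPy and all parameter values are choices made only for illustration.

```python
import numpy as np
from scipy.signal import welch

# Illustrative parameters (assumptions for this sketch)
tau, q = 1.0, 0.5            # relaxation time; noise intensity <xi(t) xi(t')> = 2 q delta(t - t')
dt, T = 1e-3, 2000.0
n = int(T / dt)
rng = np.random.default_rng(1)

# Simulate only the noise-driven part of the dynamics: d(dx)/dt = -dx/tau + xi(t)
x = np.zeros(n)
kick = np.sqrt(2 * q * dt) * rng.standard_normal(n)
for i in range(1, n):
    x[i] = x[i-1] - x[i-1] / tau * dt + kick[i]

# Welch estimate of the one-sided power spectral density of the trace
f, Pxx = welch(x, fs=1.0 / dt, nperseg=2**16)

# Two-sided analytic spectrum S(omega) = 2 q / (omega^2 + 1/tau^2);
# the one-sided PSD in ordinary frequency f is twice that, evaluated at omega = 2 pi f
omega = 2 * np.pi * f
P_theory = 2 * (2 * q) / (omega**2 + 1.0 / tau**2)

# Compare at a few low frequencies (high frequencies are distorted by the finite time step)
for k in (1, 5, 20, 100):
    print("f = %8.3f   estimated PSD = %.4f   Lorentzian = %.4f" % (f[k], Pxx[k], P_theory[k]))
```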
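
Finally, a sketch of the information-rate integral from the last bullet. The formula itself is the standard result for a Gaussian channel with independent frequency components; the enzyme (signal) spectrum is taken to be a Lorentzian, as it would be for an Ornstein-Uhlenbeck input, which is an assumption for this sketch rather than the form used in the paper.

```python
import numpy as np

# Illustrative spectra, using the symbols from the bullets above (assumed forms and values)
tau, g = 1.0, 2.0                # relaxation time and gain of the amplifier
tau_E, sigma_E = 5.0, 1.0        # correlation time and std of the enzyme fluctuations
q = 0.5                          # intensity of the white shot-like noise xi

domega = 1e-3
omega = np.arange(0.0, 200.0, domega)                      # frequency grid; the cutoff is ad hoc
S_E = 2 * sigma_E**2 * tau_E / (1 + (omega * tau_E)**2)    # Lorentzian (OU) enzyme/signal spectrum
S_signal = g**2 * S_E / (omega**2 + 1 / tau**2)            # filtered signal spectrum
S_noise = 2 * q / (omega**2 + 1 / tau**2)                  # filtered shot-noise spectrum

# I = (1/2) * integral d(omega)/(2 pi) log2(1 + S_signal/S_noise) over all omega;
# the integrand is even in omega, so integrating over omega >= 0 cancels the factor 1/2.
integrand = np.log2(1 + S_signal / S_noise)
rate = integrand.sum() * domega / (2 * np.pi)              # Riemann sum on the uniform grid
print("information rate ~ %.3f bits per unit time" % rate)
```

Note that the Lorentzian filter cancels in the ratio $S_{\rm signal}/S_{\rm noise}$, so with white intrinsic noise the information rate is controlled by the input spectrum and the noise intensity alone.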