Physics 380, 2011: Homework 6

From Ilya Nemenman: Theoretical Biophysics @ Emory

Back to the main Teaching page.

Back to Physics 380, 2011: Information Processing in Biology.

  1. Analytically calculate the (differential) entropy of a Gaussian random variable. (A numerical sanity check of this result appears after the problem list.)
  2. We start with a simple problem. In class, we have defined the mutual information between $X$ and $Y$ as a difference between a marginal and a conditional entropy, $I(X;Y) = S(X) - S(X|Y)$. Rewrite this expression to depend only on unconditional entropies. What does it say about the relation between the joint entropy of two variables and the two marginal entropies? (A small numerical check of these entropy relations appears after this list.)
  3. How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials with a Poisson process with a certain rate $r$, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time in intervals $\Delta t$. Explain why the entropy of the Poisson-generated sequence of duration $T$ (or, alternatively, $N = T/\Delta t$ symbols) is exactly proportional to time, that is, $S(T) = s\,T$, where $s$ is some constant. Thus we only need to calculate the entropy of a single symbol, call it $S_{\Delta t}$, in order to find the entropy rate as $s = S_{\Delta t}/\Delta t$. Does this rate have a finite value as $\Delta t \to 0$? Why or why not? (A numerical sketch of this discretization appears after this list.)
  4. Graduate students: Suppose now the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a time $\tau$. What is the entropy rate of such a neuron?
  5. Consider information transmission in a simple model of a molecular circuit. The input signal of $c$ molecules is distributed as a Gaussian random variable with the mean $\bar{c}$ and the variance $\sigma_c^2$. The elicited mean response is $\bar{n} = g c$, and it has the usual counting (a.k.a. Poisson, or square-root) fluctuations around the mean. Assuming that the response molecule count is large, $\bar{n} \gg 1$, what is the mutual information between $c$ and $n$? (To answer this, you will need to understand what the marginal distribution of $n$ is.) Now suppose that there are $K$ independent signals $c_i$, $i = 1, \dots, K$, and each is measured by an independent response $n_i$, such that, as above, $\bar{n}_i = g c_i$. What is the total mutual information between all the signals on the one hand, and all the responses on the other? (A rough Monte Carlo sketch of the single-channel setup appears after this list.)
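
For problem 1, the following is a minimal numerical sanity check, not the analytic calculation the problem asks for: it estimates $-\langle \ln p(x) \rangle$ from Gaussian samples and compares the result with the standard closed-form expression $\frac{1}{2}\ln(2\pi e \sigma^2)$. The value of $\sigma$ and the sample size are arbitrary choices.

```python
# Minimal Monte Carlo sanity check of the Gaussian differential entropy.
# sigma and the sample size are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, size=1_000_000)

# Estimate the entropy as -<ln p(x)> averaged over the samples.
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)
entropy_mc = -log_p.mean()

# Standard closed-form result for comparison: (1/2) ln(2 pi e sigma^2).
entropy_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print(f"Monte Carlo estimate: {entropy_mc:.4f} nats")
print(f"Closed form:          {entropy_exact:.4f} nats")
```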
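
For problem 2, a small numerical check using an arbitrary 2x2 joint distribution (the numbers are made up): it computes the marginal, joint, and conditional entropies directly from their definitions, so you can verify whatever rewriting of $I(X;Y)$ you derive.

```python
# Entropies of an arbitrary 2x2 joint distribution p(x, y); the numbers are
# made up and serve only to check entropy identities numerically.
import numpy as np

p_xy = np.array([[0.3, 0.1],
                 [0.2, 0.4]])   # rows index x, columns index y

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y

S_x, S_y, S_xy = entropy(p_x), entropy(p_y), entropy(p_xy)

# Conditional entropy from its definition, S(X|Y) = sum_y p(y) S(X | Y = y).
S_x_given_y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j]) for j in range(len(p_y)))

I_xy = S_x - S_x_given_y        # mutual information as defined in class
for name, val in [("S(X)", S_x), ("S(Y)", S_y), ("S(X,Y)", S_xy),
                  ("S(X|Y)", S_x_given_y), ("I(X;Y)", I_xy)]:
    print(f"{name:8s} = {val:.4f} bits")
```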
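
For problem 3, a minimal sketch of the time-discretization step; the firing rate and the bin widths are assumed, illustrative values. Each bin of width $\Delta t$ is treated as a binary symbol (spike or no spike), and the per-symbol entropy divided by $\Delta t$ is printed for shrinking bins, which shows how the entropy rate behaves as $\Delta t \to 0$.

```python
# Entropy rate of a time-discretized Poisson spike train; the rate r and the
# bin widths are arbitrary illustrative values.
import numpy as np

r = 40.0                           # firing rate in spikes/s (assumed)
for dt in [1e-2, 1e-3, 1e-4, 1e-5]:
    p = 1.0 - np.exp(-r * dt)      # probability of at least one spike in a bin
    # Entropy of one binary symbol (spike / no spike), in bits.
    S_bin = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    print(f"dt = {dt:.0e} s   S_bin = {S_bin:.5f} bits   S_bin/dt = {S_bin/dt:9.1f} bits/s")
```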
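
For problem 5, a rough Monte Carlo sketch of a single signal/response pair. All parameter values (the mean and variance of $c$ and the gain $g$ relating $\bar{n}$ to $c$) and the crude histogram-based mutual information estimator are illustrative assumptions, not part of the problem statement; the output is only a coarse number to hold against an analytic answer.

```python
# Rough Monte Carlo sketch for a single Gaussian signal with Poisson readout.
# All parameters and the binned MI estimator are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
c_mean, c_sigma, g = 100.0, 10.0, 5.0          # assumed signal stats and gain

c = rng.normal(c_mean, c_sigma, size=500_000)
c = np.clip(c, 0.0, None)                      # Poisson mean must be non-negative
n = rng.poisson(g * c)                         # response with counting noise

# Crude plug-in mutual information estimate from a 2D histogram (in bits).
counts, _, _ = np.histogram2d(c, n, bins=60)
p_cn = counts / counts.sum()
p_c = p_cn.sum(axis=1, keepdims=True)
p_n = p_cn.sum(axis=0, keepdims=True)
mask = p_cn > 0
mi = np.sum(p_cn[mask] * np.log2(p_cn[mask] / (p_c @ p_n)[mask]))
print(f"Binned estimate of I(c; n): {mi:.3f} bits")
```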