Physics 380, 2011: Homework 6
Back to the main Teaching page.
Back to Physics 380, 2011: Information Processing in Biology.
- Analytically calculate the entropy of a Gaussian random variable. (A numerical check of the closed-form answer is sketched after this list.)
- We start with a simple problem. In class, we have defined the mutual information between <math>X</math> and <math>Y</math> as a difference between a marginal and a conditional entropy, <math>I(X;Y)=S(X)-S(X|Y)</math>. Rewrite this expression so that it depends only on unconditional entropies. What does it say about the relation between the joint entropy of two variables and the two marginal entropies? (A numerical check on a small discrete distribution is given after this list.)
- How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate <math>r</math>, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals <math>\Delta t</math>. Explain why the entropy of the Poisson-generated sequence of duration <math>T</math> (or, alternatively, <math>N=T/\Delta t</math> symbols) is exactly proportional to time, that is, <math>S(T)=s_1 T/\Delta t</math>, where <math>s_1</math> is some constant. Thus we only need to calculate the entropy of a single symbol, this <math>s_1</math>, in order to find the entropy rate as <math>s_1/\Delta t</math>. Does this rate have a finite value as <math>\Delta t\to 0</math>? Why or why not? (A numerical illustration of the small-<math>\Delta t</math> behavior is given after this list.)
- Graduate students: Suppose now the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a time <math>\tau</math>. What is the entropy rate of such a neuron? (A discrete-time sketch of one way to compute this is given after this list.)
- Consider information transmission in a simple model of a molecular circuit. The input signal of <math>c</math> molecules is distributed as a Gaussian random variable with the mean <math>\bar{c}</math> and the variance <math>\sigma_c^2</math>. The elicited mean response is <math>\bar{g}(c)=g_1 c</math>, and it has the usual counting (a.k.a., Poisson or square root) fluctuations around the mean. Assuming that the response molecule count is large, <math>\bar{g}\gg 1</math>, what is the mutual information between <math>c</math> and <math>g</math>? (To answer this, you will need to understand what the marginal distribution of <math>g</math> is.) Now suppose that there are <math>N</math> independent signals <math>c_i</math>, and each is measured by an independent response <math>g_i</math>, such that, as above, <math>\bar{g}_i=g_1 c_i</math>. What is the total mutual information between all the signals on the one hand, and all the responses on the other? (A Monte Carlo sketch of the single-channel case is given after this list.)
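For the Gaussian entropy problem above, here is a minimal numerical sanity check, not part of the original assignment: it estimates the differential entropy of a Gaussian by Monte Carlo averaging of <math>-\log_2 p(x)</math> and compares the result to the standard closed form <math>\frac{1}{2}\log_2(2\pi e\sigma^2)</math> bits. The value of <math>\sigma</math> is an arbitrary illustrative choice.

<pre>
import numpy as np

# Monte Carlo check of the differential entropy of a Gaussian, in bits.
# The closed form is 0.5 * log2(2 * pi * e * sigma^2).
rng = np.random.default_rng(0)
sigma = 1.7                                  # arbitrary standard deviation
x = rng.normal(0.0, sigma, size=1_000_000)

# Differential entropy is the expectation of -log p(x) under p itself.
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)
h_mc = -np.mean(log_p) / np.log(2)           # convert nats to bits

h_exact = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(f"Monte Carlo estimate: {h_mc:.4f} bits")
print(f"Closed form:          {h_exact:.4f} bits")
</pre>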
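For the mutual-information problem, the sketch below builds a small, made-up joint distribution <math>p(x,y)</math>, computes <math>S(X)-S(X|Y)</math> directly from the conditional distributions, and compares it to the expression written with unconditional entropies only. Any normalized joint table would do.

<pre>
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a normalized probability table."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# An arbitrary 3x2 joint distribution p(x, y), normalized to 1.
p_xy = np.array([[0.10, 0.20],
                 [0.25, 0.05],
                 [0.15, 0.25]])

p_x = p_xy.sum(axis=1)          # marginal over y
p_y = p_xy.sum(axis=0)          # marginal over x

# Conditional entropy S(X|Y) = sum_y p(y) S(X | Y=y), computed directly.
S_x_given_y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j])
                  for j in range(p_xy.shape[1]))

I_def = entropy(p_x) - S_x_given_y                        # definition from class
I_unconditional = entropy(p_x) + entropy(p_y) - entropy(p_xy)

print(f"I from S(X) - S(X|Y):     {I_def:.6f} bits")
print(f"I from S(X)+S(Y)-S(X,Y):  {I_unconditional:.6f} bits")
</pre>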
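For the Poisson entropy-rate problem, the sketch below tabulates the entropy <math>s_1</math> of a single binary symbol with spiking probability <math>p=r\Delta t</math> and the corresponding rate <math>s_1/\Delta t</math> as the bins shrink. The firing rate is an arbitrary illustrative value; the table only shows the trend, not the analytic answer.

<pre>
import numpy as np

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) symbol."""
    q = 1.0 - p
    return -(p * np.log2(p) + q * np.log2(q))

r = 40.0   # firing rate in spikes/s (arbitrary illustrative value)

print(f"{'dt (s)':>10} {'p = r*dt':>10} {'s1 (bits)':>12} {'rate (bits/s)':>15}")
for dt in [1e-2, 1e-3, 1e-4, 1e-5, 1e-6]:
    p = r * dt                     # probability of a spike in one bin
    s1 = binary_entropy(p)         # entropy of a single binary symbol
    print(f"{dt:>10.0e} {p:>10.1e} {s1:>12.5f} {s1 / dt:>15.1f}")
</pre>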
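For the graduate problem, one possible numerical check (an assumption about the modeling route, not necessarily the intended solution) is to describe the binned spike train as a Markov chain over the time since the last spike, with spiking probability <math>p</math> in each non-refractory bin, and to use the entropy rate of a stationary Markov chain: only the non-refractory state contributes, weighted by its stationary probability. All parameter values below are illustrative.

<pre>
import numpy as np

def binary_entropy(p):
    q = 1.0 - p
    return -(p * np.log2(p) + q * np.log2(q))

# Illustrative parameters (assumptions, not from the assignment):
dt = 1e-3        # time bin, s
p = 0.04         # spiking probability per non-refractory bin
tau = 5e-3       # absolute refractory period, s
n_ref = int(round(tau / dt))   # number of forced-silent bins after each spike

# Markov chain over "bins since last spike": states 0..n_ref-1 are refractory
# (the next symbol is forced to be 0) and contribute no entropy; the remaining
# "free" state emits a spike with probability p.  Its stationary probability
# works out to 1 / (1 + n_ref * p).
pi_free = 1.0 / (1.0 + n_ref * p)

rate_refractory = pi_free * binary_entropy(p) / dt   # bits per second
rate_no_refractory = binary_entropy(p) / dt

print(f"entropy rate without refractory period: {rate_no_refractory:.1f} bits/s")
print(f"entropy rate with tau = {tau * 1e3:.0f} ms refractory: {rate_refractory:.1f} bits/s")
</pre>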
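For the molecular-circuit problem, the sketch below is a Monte Carlo illustration of the single-channel case; the gain <math>g_1</math> and the input mean and variance are invented for the example. It relies on the large-count approximation, under which <math>(c,g)</math> is close to jointly Gaussian, so the mutual information in bits can be estimated from the sample correlation coefficient as <math>-\frac{1}{2}\log_2(1-\rho^2)</math>; for <math>N</math> independent signal-response pairs the single-channel contributions simply add.

<pre>
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (not specified in the assignment):
c_mean, c_std = 200.0, 40.0   # mean and std of the Gaussian input signal
g1 = 5.0                      # gain: mean response is g1 * c

n_samples = 500_000
c = rng.normal(c_mean, c_std, size=n_samples)
c = np.clip(c, 0.0, None)               # keep the Poisson mean non-negative
g = rng.poisson(g1 * c)                 # counting noise around the mean response

# In the large-count limit (c, g) is approximately jointly Gaussian, so the
# mutual information (in bits) can be estimated from the correlation coefficient.
rho = np.corrcoef(c, g)[0, 1]
I_single = -0.5 * np.log2(1.0 - rho**2)

print(f"estimated I(c; g) for one channel: {I_single:.3f} bits")
print("for N independent channels the total information is N times this value")
</pre>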