Physics 380, 2011: Homework 6
- Analytically calculate the entropy of a Gaussian random variable.
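As a numerical companion to the analytic calculation, the sketch below (an illustration, not the derivation asked for) compares the closed-form differential entropy of a Gaussian, H = (1/2) ln(2 pi e sigma^2) nats, against a Monte Carlo estimate of E[-ln p(x)]; the function name and parameters are ours:

```python
import numpy as np

# Analytic differential entropy of a Gaussian, in nats:
# H = 0.5 * ln(2 * pi * e * sigma^2)
def gaussian_entropy(sigma):
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# Monte Carlo check: H = E[-ln p(x)] over samples drawn from the same Gaussian.
rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, size=200_000)
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)

print(gaussian_entropy(sigma))  # analytic value
print(-log_p.mean())            # Monte Carlo estimate; should agree closely
```

Note that the entropy depends only on the variance, not the mean, which the analytic calculation should make explicit.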
- We start with a simple problem. In class, we have defined the mutual information between X and Y as a difference between a marginal and a conditional entropy, I(X;Y) = H(X) - H(X|Y). Rewrite this expression so that it depends only on unconditional entropies. What does it say about the relation between the joint entropy of two variables and the two marginal entropies?
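The identity you should arrive at can be checked numerically on any small joint distribution; here is a minimal sketch (the toy probabilities are arbitrary) verifying that the conditional-entropy form and the unconditional-entropy form agree:

```python
import numpy as np

# Toy joint distribution p(x, y) over two binary variables (rows: x, cols: y).
p_xy = np.array([[0.3, 0.1],
                 [0.2, 0.4]])

def H(p):
    """Shannon entropy in bits; zero entries are ignored."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)   # marginal of X
p_y = p_xy.sum(axis=0)   # marginal of Y

# H(X|Y) = H(X,Y) - H(Y), so the definition I = H(X) - H(X|Y) becomes:
I_def = H(p_x) - (H(p_xy) - H(p_y))
# ...which is the symmetric, unconditional-entropy form:
I_sym = H(p_x) + H(p_y) - H(p_xy)

print(I_def, I_sym)  # the two expressions agree
```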
- How much information can a spiking neuron transmit? This is bounded from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate λ, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals Δt. Explain why the entropy of the Poisson-generated sequence of duration T (or, alternatively, N = T/Δt symbols) is exactly proportional to time, that is S(T) = cT, where c is some constant. Thus we only need to calculate the entropy of a single symbol, S(Δt), in order to find the entropy rate as c = S(Δt)/Δt. Does this rate have a finite value as Δt → 0? Why or why not?
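To build intuition for the Δt → 0 limit, the sketch below (the rate of 40 spikes/s is an arbitrary illustrative choice) computes the per-symbol binary entropy of a spike/no-spike bin with p = λΔt and divides by Δt to get the entropy rate:

```python
import numpy as np

# In a bin of width dt, a Poisson process with rate lam spikes with
# probability p = lam * dt (valid when lam * dt << 1). The per-symbol
# entropy is the binary entropy of p; dividing by dt gives the rate.
def entropy_rate(lam, dt):
    p = lam * dt
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)  # bits per symbol
    return h / dt                                    # bits per second

lam = 40.0  # spikes/s, an arbitrary example rate
for dt in [1e-2, 1e-3, 1e-4, 1e-5]:
    print(dt, entropy_rate(lam, dt))
```

Watching the printed values grow as Δt shrinks should suggest how the rate behaves (roughly like λ log2(1/(λΔt))) and hence the answer to the finiteness question.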
- Graduate students: Suppose now that the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a time τ. What is the entropy rate of such a neuron?
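Before attacking the entropy rate analytically, it can help to simulate the process being described. This sketch (parameter values are arbitrary; it does not answer the entropy question) generates a Poisson spike train with a dead time τ after each spike and measures the realized firing rate:

```python
import numpy as np

# Simulate a spike train that fires as a Poisson process with rate lam but
# is silenced for tau seconds after each spike; return the spike times.
def refractory_spike_times(lam, tau, t_max, rng):
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam)  # Poisson waiting time to next spike
        if t > t_max:
            return np.array(spikes)
        spikes.append(t)
        t += tau                         # dead time after each spike

rng = np.random.default_rng(1)
spikes = refractory_spike_times(lam=100.0, tau=5e-3, t_max=100.0, rng=rng)
print(len(spikes) / 100.0)  # realized rate ~ lam / (1 + lam * tau)
```

The mean interspike interval becomes 1/λ + τ, so the refractory period lowers the effective firing rate; think about what it does to the space of allowed spike sequences.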
- Consider information transmission in a simple model of a molecular circuit. The input signal s (a molecule count) is distributed as a Gaussian random variable with the mean μ_s and the variance σ_s^2. The elicited mean response is μ_r = g μ_s, where g is the gain, and the response has the usual counting (a.k.a. Poisson, or square-root) fluctuations around the mean. Assuming that the response molecule count is large, μ_r ≫ 1, what is the mutual information between s and r? (To answer this, you will need to understand what the marginal distribution of r is.) Now suppose that there are N independent signals s_n, and each is measured by an independent response r_n, such that, as above, μ_{r_n} = g_n μ_{s_n}. What is the total mutual information between all signals on the one hand, and all responses on the other?
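A rough numerical sketch under a simplifying assumption we introduce here (treat r | s as Gaussian with counting noise evaluated at the mean input, so the channel looks Gaussian with constant noise): the information is then approximately (1/2) log2(1 + signal variance / noise variance), and for independent channels the informations add. All parameter values below are illustrative:

```python
import numpy as np

# Gaussian-channel estimate (an assumption, not the full derivation):
# r | s is ~ Gaussian with mean g*s; the counting noise variance is taken
# as the mean response g*mu_s, treated as constant across inputs.
def info_bits(mu_s, sigma_s, g):
    signal_var = g**2 * sigma_s**2   # variance of the mean response g*s
    noise_var = g * mu_s             # Poisson noise: variance ~ mean count
    return 0.5 * np.log2(1 + signal_var / noise_var)

mu_s, sigma_s = 100.0, 20.0
print(info_bits(mu_s, sigma_s, g=5.0))

# For N independent (signal, response) pairs the informations simply add:
gains = [2.0, 5.0, 10.0]
total = sum(info_bits(mu_s, sigma_s, g) for g in gains)
print(total)
```

Note how the information grows with the gain g: a larger mean count means relatively smaller square-root fluctuations.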