Physics 380, 2011: Homework 6

From Ilya Nemenman: Theoretical Biophysics @ Emory

Back to the main Teaching page.

Back to Physics 380, 2011: Information Processing in Biology.

  1. Analytically calculate the entropy of a Gaussian random variable. (A numerical sanity check is sketched after this problem list.)
  2. We start with a simple problem. In class, we defined the mutual information between $X$ and $Y$ as a difference between a marginal and a conditional entropy, $I(X;Y)=H(X)-H(X|Y)$. Rewrite this expression so that it depends only on unconditional entropies. What does it say about the relation between the joint entropy of two variables and the two marginal entropies? (See the second sketch below.)
  3. How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate $r$, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals $\Delta t$. Explain why the entropy of the Poisson-generated sequence of duration $T$ (or, alternatively, $N=T/\Delta t$ symbols) is exactly proportional to time, that is, $S=sT$, where $s$ is some constant. Thus we only need to calculate the entropy of a single symbol, this $S_1$, in order to find the entropy rate as $s=S_1/\Delta t$. Does this rate have a finite value as $\Delta t\to 0$? Why or why not? (The third sketch below explores this numerically.)
  4. Graduate students: Suppose now that the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a time $\tau$. What is the entropy rate of such a neuron? (See the fourth sketch below.)
  5. Consider information transmission in a simple model of a molecular circuit. The input signal of $s$ molecules is distributed as a Gaussian random variable with the mean $\mu_s$ and the variance $\sigma_s$. The elicited mean response is $\mu_r=g\mu_s$, and it has the usual counting (a.k.a. Poisson, or square-root) fluctuations around the mean. Assuming that the response molecule count is large, $r\gg 1$, what is the mutual information between $s$ and $r$? (To answer this, you will need to understand what the marginal distribution of $r$ is.) Now suppose that there are $N$ independent signals $s_n$, and each is measured by an independent response $r_n$, such that, as above, $\mu_{r_n}=g_n\mu_{s_n}$. What is the total mutual information between all signals on the one hand and all responses on the other? (The last sketch below checks the single-channel answer numerically.)
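
A few numerical sanity checks for the problems above, in Python with made-up parameter values; they assume particular readings of each problem and are meant as cross-checks, not as the intended derivations. For problem 1, this Monte Carlo estimate assumes the calculation should recover the standard closed form $H=\frac{1}{2}\ln(2\pi e\sigma^2)$ for a Gaussian of variance $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                       # illustrative standard deviation

# Monte Carlo estimate of the differential entropy H = -E[log p(x)]
x = rng.normal(0.0, sigma, size=1_000_000)
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)
h_mc = -log_p.mean()

# standard closed form for a Gaussian: H = (1/2) ln(2 pi e sigma^2), in nats
h_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print(h_mc, h_exact)              # should agree to ~3 decimal places
```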
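For problem 2, a candidate rewriting can be tested on a random discrete joint distribution; the unconditional-entropy combination checked here is an assumption of this sketch, i.e., the identity you should derive and verify yourself:

```python
import numpy as np

rng = np.random.default_rng(1)
p_xy = rng.random((4, 5))
p_xy /= p_xy.sum()                # a random joint distribution p(x, y)
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

def H(p):
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# the definition from class, I = H(Y) - H(Y|X), with H(Y|X) from first principles
H_cond = -(p_xy * np.log2(p_xy / p_x[:, None])).sum()
I_def = H(p_y) - H_cond

# candidate form that uses unconditional entropies only
I_unc = H(p_x) + H(p_y) - H(p_xy)

print(I_def, I_unc)               # the two agree to machine precision
```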
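For problem 3, each symbol is binary (spike or no spike in a bin), with spike probability $p=1-e^{-r\Delta t}\approx r\Delta t$ for small bins. A quick scan over $\Delta t$, with an arbitrary illustrative rate, hints at the answer to the finiteness question:

```python
import numpy as np

r = 40.0                          # illustrative firing rate, spikes/s

for dt in [1e-2, 1e-3, 1e-4, 1e-5]:
    p = 1 - np.exp(-r * dt)       # P(at least one spike in a bin of width dt)
    S1 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # entropy per symbol, bits
    print(f"dt = {dt:.0e}   entropy rate = {S1 / dt:10.1f} bits/s")
```

The printed rates suggest a growth like $r\log_2\!\big(1/(r\Delta t)\big)$ as the bins shrink; explaining whether and why that behavior is expected is the point of the problem.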
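For problem 4, here is one possible cross-check, not the assigned derivation: discretize as before, treat the $k=\tau/\Delta t$ bins after each spike as deterministic (zero-entropy) symbols, and weight the free-bin entropy by the stationary fraction of non-refractory bins. The renewal bookkeeping (a mean inter-spike interval of $k+1/p$ bins) is an assumption of this sketch:

```python
import numpy as np

r, tau, dt = 40.0, 0.005, 1e-4    # illustrative rate (1/s), refractory period (s), bin (s)
p = 1 - np.exp(-r * dt)           # spike probability in a non-refractory bin
k = int(round(tau / dt))          # refractory bins after each spike (deterministic)

H2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # entropy of a free bin, bits

# mean inter-spike interval is k + 1/p bins, of which 1/p carry entropy,
# so the stationary fraction of entropy-carrying bins is 1/(1 + k*p)
s = H2 / dt / (1 + k * p)
print(s, "bits/s")                # reduces to the problem-3 rate when tau = 0
```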
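For problem 5 (the single-channel part), the information can be computed by brute force, discretizing $s$ and summing over Poisson counts $r$, and compared with a large-count Gaussian-channel approximation. The parameter values, and the reading $p(r|s)=\mathrm{Poisson}(gs)$ with $\sigma_s$ taken as the standard deviation of $s$, are assumptions of this sketch:

```python
import numpy as np
from scipy.stats import norm, poisson

g, mu_s, sigma_s = 10.0, 100.0, 10.0           # hypothetical gain and signal statistics

s_grid = np.linspace(mu_s - 5 * sigma_s, mu_s + 5 * sigma_s, 2001)
w = norm.pdf(s_grid, mu_s, sigma_s)
w /= w.sum()                                    # discretized p(s)

r_max = g * s_grid[-1]
r_vals = np.arange(0, int(r_max + 10 * np.sqrt(r_max)))
p_r_given_s = poisson.pmf(r_vals[None, :], g * s_grid[:, None])

def H(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

p_r = w @ p_r_given_s                           # marginal distribution of r
I_exact = H(p_r) - sum(wi * H(row) for wi, row in zip(w, p_r_given_s))

# large-count approximation: Gaussian channel with noise variance <r> = g*mu_s
I_gauss = 0.5 * np.log2(1 + g**2 * sigma_s**2 / (g * mu_s))
print(I_exact, I_gauss)                         # roughly 1.7 bits each here
```

For the $N$-channel part, the independence of the $(s_n, r_n)$ pairs suggests checking whether the single-channel informations simply add.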