Physics 380, 2011: Homework 6
- Analytically calculate the entropy of a Gaussian random variable.
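A minimal setup for this problem, writing entropies in bits and using the differential entropy of a continuous variable (here $\mu$ and $\sigma^2$ are the mean and the variance):

    p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/2\sigma^2}, \qquad S[X] = -\int_{-\infty}^{\infty} dx\, p(x)\, \log_2 p(x).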
- We start with a simple problem. In class, we defined the mutual information between $X$ and $Y$ as the difference between a marginal and a conditional entropy, $I[X;Y] = S[X] - S[X|Y]$. Rewrite this expression so that it depends only on unconditional entropies. What does it say about the relation between the joint entropy of two variables and the two marginal entropies?
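If you want to check your derivation numerically, here is a minimal sketch; the joint distribution below is an arbitrary toy example, not part of the problem. It computes $S[X|Y]$ directly from its definition and prints the three unconditional entropies for comparison.

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a probability array (zeros ignored)."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    p_xy = np.array([[0.10, 0.20, 0.10],
                     [0.30, 0.10, 0.20]])  # toy joint p(x, y), normalized

    p_x = p_xy.sum(axis=1)  # marginal p(x)
    p_y = p_xy.sum(axis=0)  # marginal p(y)

    # conditional entropy S[X|Y] = sum_y p(y) S[X | Y = y], from its definition
    S_x_given_y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j])
                      for j in range(p_xy.shape[1]))

    I = entropy(p_x) - S_x_given_y  # the in-class definition of I[X;Y]
    print(f"S[X] = {entropy(p_x):.4f}  S[Y] = {entropy(p_y):.4f}  "
          f"S[X,Y] = {entropy(p_xy.ravel()):.4f}  I[X;Y] = {I:.4f}")

Compare the printed $I[X;Y]$ against your expression built from the three unconditional entropies.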
- How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate $r$, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals of length $\Delta t$. Explain why the entropy of the Poisson-generated sequence of duration $T$ (or, alternatively, $n = T/\Delta t$ symbols) is exactly proportional to time, that is, $S = sn$, where $s$ is some constant. Thus we only need to calculate the entropy of a single symbol, this $s$, in order to find the entropy rate as $s/\Delta t$. Does this rate have a finite value as $\Delta t \to 0$? Why or why not?
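A small numerical exploration of the limiting question. This assumes the standard small-$\Delta t$ picture, in which each bin independently contains a spike with probability $p = r\,\Delta t$; the rate value is an arbitrary choice.

    import numpy as np

    r = 40.0  # firing rate in spikes/s, arbitrary choice

    for dt in [1e-2, 1e-3, 1e-4, 1e-5]:  # bin size in seconds
        p = r * dt                        # spike probability per bin
        s = -p * np.log2(p) - (1 - p) * np.log2(1 - p)  # entropy per bin, bits
        print(f"dt = {dt:.0e} s   entropy rate s/dt = {s/dt:9.1f} bits/s")

Watching how the printed rate behaves as $\Delta t$ shrinks should suggest the answer to the finiteness question.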
- Graduate students: Suppose now that the neuron has what's called a refractory period: after a spike, the neuron cannot fire for a time $\tau$. What is the entropy rate of such a neuron?
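One possible route (a hint, not the only approach): in discretized time, the refractory spike train is a Markov chain whose state tracks the time since the last spike, and the per-symbol entropy of a stationary Markov chain with transition matrix $P_{ij}$ and stationary distribution $\pi_i$ is the standard expression

    s = -\sum_i \pi_i \sum_j P_{ij} \log_2 P_{ij},

which you can then convert to an entropy rate by dividing by $\Delta t$.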
- Consider information transmission in a simple model of a molecular circuit. The input signal $x$ (a molecule count) is distributed as a Gaussian random variable with mean $\bar{x}$ and variance $\sigma_x^2$. The elicited mean response is $\bar{y}(x)$, and the response $y$ has the usual counting (a.k.a. Poisson, or square-root) fluctuations around this mean. Assuming that the response molecule count is large, $\bar{y} \gg 1$, what is the mutual information between $x$ and $y$? (To answer this, you will need to understand what the marginal distribution of $y$ is.) Now suppose that there are $N$ independent signals $x_i$, and each is measured by an independent response $y_i$, such that, as above, $y_i$ has counting fluctuations around the mean response $\bar{y}(x_i)$. What is the total mutual information between all of the signals on the one hand and all of the responses on the other?
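If you want a rough numerical companion to this problem, here is a minimal Monte Carlo sketch. The linear mean response $\bar{y}(x) = g x$ and all parameter values are hypothetical choices made only for illustration, and the binned plug-in estimator is crude (it has finite-sample and binning bias) but adequate for a sanity check.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical parameters, chosen only for illustration
    xbar, sigma_x, g = 100.0, 10.0, 5.0

    x = rng.normal(xbar, sigma_x, size=200_000)   # Gaussian input signal
    lam = np.maximum(g * x, 0.0)                  # guard against the (negligible) chance of a negative mean
    y = rng.poisson(lam)                          # counting noise: Var[y|x] = ybar(x)

    def mi_binned(x, y, bins=60):
        """Crude plug-in mutual-information estimate (bits) from a 2D histogram."""
        p_xy, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy /= p_xy.sum()
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0
        return np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz]))

    print(f"estimated I[x;y] ~ {mi_binned(x, y):.2f} bits")

Because the $N$ channels are independent, you can simulate each one separately and ask how the single-channel result should combine into the total.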