Physics 434, 2014: Homework 7

From Ilya Nemenman: Theoretical Biophysics @ Emory

Back to the main Teaching page.

Back to Physics 434, 2014: Information Processing in Biology.

  1. Show that is equal to .
  2. What is the differential entropy of an exponential distribution? (The defining integral is written out after this list.)
  3. How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate $r$, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals $\Delta t$. Explain why the entropy of the Poisson-generated sequence of duration $T$ (or, alternatively, $N = T/\Delta t$ symbols) is exactly proportional to time, that is, $S(T) = hT$, where $h$ is some constant. Thus we only need to calculate the entropy of a single symbol, $S_1$, in order to find the entropy rate as $h = S_1/\Delta t$. Does this rate have a finite value as $\Delta t \to 0$? Why or why not? Estimate the maximum bitrate of a neuron that can control the placement of its spikes to an accuracy of 1 ms. (A small numerical sketch of the per-symbol entropy appears after this list.)
  4. Graduate students: Suppose now that the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a time $\tau$. What is the entropy rate of such a neuron?
  5. Let's explore the channel coding theorem (and also learn how to do optimization in Matlab and Octave). Suppose we have the following molecular information processing channel: for any number of molecules $n$ at the input, the channel output $m$ is a Poisson variable with that mean (this is relevant to Ziv et al., 2007, which was discussed by Martin in one of the lectures). That is, $P(m|n) = \frac{n^m}{m!} e^{-n}$. Write code to estimate the mutual information over this channel for an arbitrary distribution of the (discrete) input signal; pass the input distribution as an argument to the function you write (a minimal sketch appears after this list). Explore different input distributions, assuming that the number of input molecules is between 0 and 64. What are the general features of the input distributions that achieve higher mutual information? Recall that Ziv et al. showed how many bits you should be able to send through this channel. Can you find a distribution that allows you to send close to that many bits? Submit plots of your "most informative" distributions.
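
For problem 2, a minimal setup, assuming the exponential density is parametrized by a rate $\lambda > 0$ (the choice of symbol is ours; any equivalent parametrization works):

 p(x) = \lambda e^{-\lambda x}, \quad x \ge 0, \qquad
 S[p] = -\int_0^\infty dx\, p(x) \log_2 p(x).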
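
For problem 3, here is a minimal Octave/Matlab sketch (the variable names and the example firing rate are our assumptions, not part of the problem) that evaluates the per-symbol entropy $S_1$ of the time-discretized process and the corresponding rate $S_1/\Delta t$ over a range of bin sizes; it may help in thinking about the $\Delta t \to 0$ limit.

 % Per-bin entropy of a discretized Poisson spike train (a sketch, not a solution).
 % In each bin of size dt, a spike occurs with probability p = r*dt (valid for r*dt << 1).
 r  = 50;                        % firing rate, spikes/s (arbitrary example value)
 dt = logspace(-4, -2, 50);      % bin sizes, in seconds
 p  = r * dt;                    % probability of a spike in a single bin
 S1 = -p .* log2(p) - (1 - p) .* log2(1 - p);   % entropy of one symbol, in bits
 h  = S1 ./ dt;                  % entropy rate, bits/s
 semilogx(dt, h);
 xlabel('\Delta t (s)'); ylabel('entropy rate (bits/s)');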
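
For problem 5, a minimal Octave/Matlab sketch of the mutual-information calculation (the function name, the output cutoff at m = 200, and the variable names are our assumptions; the exploration and optimization over input distributions is left to you). Save it as poisson_channel_MI.m:

 function I = poisson_channel_MI(p_in)
   % Sketch: mutual information (in bits) across the channel P(m|n) = Poisson
   % with mean n, for an input distribution p_in over n = 0, 1, ..., numel(p_in)-1.
   n = (0:numel(p_in)-1)';          % column vector of input molecule counts
   m = 0:200;                       % truncated output range; holds essentially all mass for means up to 64
   p_in = p_in(:) / sum(p_in);      % force a normalized column vector
   [M, N] = meshgrid(m, n);         % M(i,j) = m(j), N(i,j) = n(i)
   logP = M .* log(N) - N - gammaln(M + 1);   % log Poisson pmf; gammaln avoids factorial overflow
   logP(N == 0 & M == 0) = 0;       % a Poisson with mean 0 puts all its mass at m = 0
   P = exp(logP);                   % P(i,j) = P(m = m(j) | n = n(i))
   p_m = p_in' * P;                 % output marginal P(m), a row vector over m
   % I = sum_n p(n) sum_m P(m|n) log2[ P(m|n)/P(m) ], with the 0*log(0) = 0 convention
   term = P .* (log2(P) - repmat(log2(p_m), numel(n), 1));
   term(P == 0) = 0;
   term(:, p_m == 0) = 0;           % outputs with zero marginal probability contribute nothing
   I = p_in' * sum(term, 2);
 end

For example, a uniform input distribution over 0 to 64 molecules would be evaluated as

 p = ones(1, 65) / 65;
 I_uniform = poisson_channel_MI(p)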