Physics 434, 2012: Homework 6
- Let's explore the channel coding theorem (and also learn how to do optimization in Matlab and Octave). Suppose we have the following molecular information processing channel: for any number of molecules $n$ at the input, the channel output $m$ is a Poisson random variable with that mean, that is, $P(m|n) = n^m e^{-n}/m!$ (this is relevant to Ziv et al., 2007, which Martin discussed in one of the lectures). Write code to estimate the mutual information across this channel for an arbitrary distribution of the (discrete) input signal, using the input distribution as an argument of the function you write. Explore different input distributions, assuming that the number of input molecules is between 0 and 64. What are the general features of the input distributions that achieve higher mutual information? Recall that Ziv et al. showed how many bits one should be able to send through this channel; can you find a distribution that gets close to that? Now try to maximize the mutual information by systematically varying the input distribution with a minimization routine (in Matlab you will need the fminsearch function, and in Octave the sqp function; the Matlab and Octave optimization routines are not compatible). How many bits can you send through at the maximum? Submit plots of your "most informative" distributions. A code sketch for getting started appears after this list.
- Open-ended problem, not required: In class, we considered a population of bacteria that choose between two different behavioral responses to environmental states. Now suppose that transitions among the environments are not totally random; that is, a high-stress environment is more likely to follow another high-stress environment than a normal one, and vice versa. Can you calculate the optimal phenotypic response strategies and the optimal growth rate in this case? (One possible formalization of such an environment is sketched after this list.)
- We briefly touched on using mutual information to choose optimal reduced models of data. Consider now three neurons, X, Y, and Z. We need to understand whether both X and Y project to Z and affect its spiking. Further, if both affect Z, we need to understand how large an error we make by neglecting one of the projections. Download the following file of simulated neural spiking data. In this file you will find a variable spikes of size 3 x 10^4: the rows are the activities of the three neurons, and the columns are the time steps, so each entry tells you how many times a particular neuron fired in a particular time step. Estimate the mutual informations among the neurons and answer the two questions above; a sketch of a simple estimator appears after this list.
- Open-ended problem, not required: Now download the file of actual neural data (thanks to Ron Calabrese). This is the activity of six neurons in a leech heart. Can you analyze the data and reconstruct the wiring diagram of this neural circuit?
- Assume that an enzyme can bind a substrate $S_1$ with rate $k_1$, convert it to the product $P_1$ with rate $v_1$, or release it back at rate $k_{-1}$. It can also bind a second substrate $S_2$ with rate $k_2$, convert it to the product $P_2$ with rate $v_2$, or release it back at rate $k_{-2}$. However, it cannot bind both substrates at the same time. Calculate the production rates of the products in this case in the quasi-steady-state limit. One way to set up the calculation is sketched after this list.
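For the channel coding problem, here is a minimal Matlab/Octave sketch of the mutual-information computation; the function name, the output cutoff at m = 200, and the softmax reparameterization used below to keep fminsearch on the probability simplex are choices of this sketch, not part of the assignment. Save the function as poisson_channel_info.m:

    function I = poisson_channel_info(p)
        % Mutual information I(n;m) of the Poisson channel P(m|n) = n^m exp(-n)/m!
        % for an input distribution p over n = 0, 1, ..., 64.
        p = p(:) / sum(p);                 % normalized column vector
        n = (0:64)';                       % input molecule counts
        m = 0:200;                         % output counts (cutoff well above the largest mean)
        % log P(m|n), computed in logs to avoid overflow; the n = 0 row is set
        % by hand, since for n = 0 all the probability sits at m = 0.
        logP = bsxfun(@times, m, log(max(n, 1))) ...
               - repmat(n, 1, numel(m)) - repmat(gammaln(m + 1), numel(n), 1);
        Pmn = exp(logP);
        Pmn(1, :) = 0;  Pmn(1, 1) = 1;
        joint = bsxfun(@times, p, Pmn);    % P(n,m) = p(n) P(m|n)
        pm = sum(joint, 1);                % output marginal P(m)
        terms = joint .* log2(bsxfun(@rdivide, Pmn, pm));
        terms(joint == 0) = 0;             % convention: 0 log 0 = 0
        I = sum(terms(:));                 % in bits
    end

One possible way to use fminsearch is then to optimize unconstrained variables x and map them onto the simplex through p = exp(x)/sum(exp(x)):

    negI = @(x) -poisson_channel_info(exp(x) / sum(exp(x)));
    xopt = fminsearch(negI, zeros(65, 1), optimset('MaxFunEvals', 1e5, 'MaxIter', 1e5));
    popt = exp(xopt) / sum(exp(xopt));
    bar(0:64, popt);                       % the candidate "most informative" distribution

In Octave, sqp can instead work with p directly, using sum(p) - 1 = 0 and 0 <= p <= 1 as constraints; see help sqp.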
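For the correlated-environment problem, one possible way to make "not totally random" precise (the two-state notation and the parameters $a$ and $b$ below are just one choice, not part of the problem) is to let the environment $\epsilon_t \in \{\text{normal}, \text{high stress}\}$ be a two-state Markov chain with

$$P(\epsilon_{t+1} = \text{high}\mid \epsilon_t = \text{high}) = 1 - a, \qquad P(\epsilon_{t+1} = \text{normal}\mid \epsilon_t = \text{normal}) = 1 - b, \qquad a, b < \tfrac{1}{2},$$

so that each environment tends to persist; the independent-environment case considered in class corresponds to $1 - a = b$.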
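For the three-neuron problem, here is a minimal Matlab/Octave sketch of a plug-in (histogram) estimate of the mutual information between two of the recorded spike trains. The file name in the load command is a placeholder for whatever the downloaded file is called, and the choice of rows is only an example:

    load('spikes.mat');                        % placeholder name; the file provides the variable "spikes"
    x = spikes(1, :);  z = spikes(3, :);       % e.g., neuron X and neuron Z
    xi = x - min(x) + 1;  zi = z - min(z) + 1; % shift spike counts to positive integer indices
    Pxz = accumarray([xi(:), zi(:)], 1);       % joint histogram of the two counts
    Pxz = Pxz / sum(Pxz(:));                   % joint probability P(x,z)
    Px = sum(Pxz, 2);  Pz = sum(Pxz, 1);       % marginals
    terms = Pxz .* log2(Pxz ./ (Px * Pz));     % pointwise information
    terms(Pxz == 0) = 0;                       % convention: 0 log 0 = 0
    I_xz = sum(terms(:))                       % bits per time step

    % To compare the joint projection with the individual ones, the pair (X,Y) can
    % be encoded as a single discrete symbol and run through the same estimator:
    %   y = spikes(2, :);  yi = y - min(y) + 1;
    %   xyi = xi + (yi - 1) * max(xi);         % unique index for each (x,y) pair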
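For the enzyme problem, with the notation introduced above, the quasi-steady-state calculation can be set up from the scheme

$$E + S_1 \underset{k_{-1}}{\overset{k_1}{\rightleftharpoons}} ES_1 \xrightarrow{v_1} E + P_1, \qquad E + S_2 \underset{k_{-2}}{\overset{k_2}{\rightleftharpoons}} ES_2 \xrightarrow{v_2} E + P_2,$$

with the quasi-steady-state conditions

$$\frac{d[ES_1]}{dt} = k_1 [E][S_1] - (k_{-1} + v_1)[ES_1] \approx 0, \qquad \frac{d[ES_2]}{dt} = k_2 [E][S_2] - (k_{-2} + v_2)[ES_2] \approx 0,$$

and the conservation law $[E]_{\rm tot} = [E] + [ES_1] + [ES_2]$, which encodes the fact that the enzyme cannot bind both substrates at once. The production rates you are asked for are then $v_1 [ES_1]$ and $v_2 [ES_2]$.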