Physics 380, 2012: Homework 12

From Ilya Nemenman: Theoretical Biophysics @ Emory

Back to the main Teaching page.

Back to Physics 380, 2012: Information Processing in Biology.

This week we will work to understand adaptation in biological circuits.

  1. Let's try to realize circuits that exhibit adaptation to the mean.
    • For starters, let's consider a system similar to those we have studied before. Let the signal activate the response by means of a Hill law with a Hill exponent of 2, and let the response be degraded with the usual linear degradation term. Now introduce a memory variable that is activated by the response in a similar Hill fashion and degraded (linearly) on a time scale that is very slow compared to the response. Finally, the memory feeds back (negatively) onto the response, so that the maximum value of the response production rate is itself a (repressive) Hill function of the memory, now with a Hill exponent of 1. Write a Matlab script that takes a given signal trace as the input and produces the corresponding response of this system as the output (a minimal sketch of such a script is given after this list). Do not consider the effects of noise.
    • Consider a signal that has a value of 1 for a time much longer than the inverse of either the response or the memory degradation rate, and that then switches to a value of 2 and stays there for an equally long time. The response will first exhibit some initial relaxation to its steady-state value. It will then jump briefly following the change in the signal and relax back close to (but not exactly to) the original steady-state value. Observe this in your simulations. Find the parameters of the system (the maximum production rates, the degradation rates, and the Michaelis constants) that allow the system to be very sensitive to changes in the input and yet adapt as close to perfectly as possible. That is, search for parameters such that the jump in the response following the step in the signal is many-fold (try to make it as large as possible), and yet the system relaxes back as close as possible to its pre-step steady-state value.
    • You will realize that there is a tradeoff here: high sensitivity to step changes makes it hard to adapt back perfectly. It may be worthwhile to read Ma et al., 2009, where this is discussed in depth. Report the best simultaneous values of the fold change in the response after a step in the stimulus and the fold change in the steady state after the relaxation, together with the corresponding parameters you found.
  2. Grad students: now let's modify the code above so that the circuit also adapts, to some extent, to the variance of the signal.
    • Consider a signal of the form $s(t) = s_0 + \sigma \sin(2\pi t/T)$, where $s_0$, $\sigma$, and $T$ are positive constants, and $T$ is such that it is much larger than the inverse of the degradation rate of the response but much smaller than the inverse of the degradation rate of the memory. Let $\sigma$ be 1 for a long while and then switch to 2. Observe that the standard deviation of the response will jump at the transition and then settle down, similarly to the response itself in the previous problem (see the second sketch after this list).
  3. Show that the "reversal to status quo ante" in Gallistel et al., 2001 (it will be presented in class on Tuesday) can be modeled if the animal's estimate of the ratio of food deposition rates, $\hat{r}(t)$, is obtained as $\hat{r}(t) = \int_0^t dt'\, K(t-t')\, r(t') \big/ \int_0^t dt'\, K(t-t')$, where $r(t')$ is the value of the ratio observed at time $t'$ and $K(\tau)$ is the memory kernel that "forgets" as a power law, $K(\tau) \propto \tau^{-\alpha}$. To do this, take a time series of $r(t)$ that has a step, with gaps when the animal cannot observe the events, and show that applying the kernel produces an estimate that reverses to the status quo (see the last sketch after this list).
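
Below is a minimal Matlab sketch of the adaptation-to-the-mean circuit from problem 1. The variable names, parameter values, and the choice of ode45 are mine, not part of the assignment; the parameters are only a starting point for the search described above.

    % Adaptation-to-the-mean circuit (problem 1): a minimal sketch.
    % dr/dt = b0/(1 + m/Km) * s(t)^2/(Ks^2 + s(t)^2) - gr*r   (Hill exponent 2 in the signal,
    %                                                           repressive Hill exponent 1 in the memory)
    % dm/dt = bm * r^2/(Kr^2 + r^2) - gm*m                     (memory; gm << gr)

    % illustrative parameter values only -- the assignment asks you to tune them
    b0 = 10; Ks = 1; Km = 1; gr = 1;     % response: max production, Michaelis constants, degradation
    bm = 1;  Kr = 1; gm = 0.01;          % memory is degraded much more slowly than the response

    Ttot = 2000;                         % total time, long compared to 1/gm
    s = @(t) 1 + (t > Ttot/2);           % signal: 1 for a long time, then a step to 2

    % state vector x = [r; m]
    rhs = @(t,x) [ b0/(1 + x(2)/Km) * s(t)^2/(Ks^2 + s(t)^2) - gr*x(1); ...
                   bm * x(1)^2/(Kr^2 + x(1)^2) - gm*x(2) ];

    [t, x] = ode45(rhs, [0 Ttot], [0; 0]);
    r = x(:,1);

    % quantify sensitivity and adaptation
    pre  = r(find(t < Ttot/2, 1, 'last'));   % steady state just before the step
    peak = max(r(t > Ttot/2));               % transient peak just after the step
    post = r(end);                           % adapted value long after the step
    fprintf('sensitivity peak/pre = %.2f, adaptation post/pre = %.2f\n', peak/pre, post/pre);

    plot(t, r); xlabel('time'); ylabel('response r');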
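
For the graduate problem, the following lines are meant to be appended to the sketch above: they replace the step signal with the oscillating signal (whose exact functional form, as written here, is my reading of the problem statement) and track a running standard deviation of the response. The window size and amplitude values are, again, only illustrative.

    % Variance-adaptation signal (problem 2); append to the sketch above.
    % 1/gr << Ts << 1/gm, so the response tracks the oscillation but the memory averages over it.
    s0  = 2; Ts = 10;
    sig = @(t) 1 + (t > Ttot/2);               % oscillation amplitude steps from 1 to 2
    s   = @(t) s0 + sig(t).*sin(2*pi*t/Ts);

    % re-create rhs so that the anonymous function captures the new signal s
    rhs = @(t,x) [ b0/(1 + x(2)/Km) * s(t)^2/(Ks^2 + s(t)^2) - gr*x(1); ...
                   bm * x(1)^2/(Kr^2 + x(1)^2) - gm*x(2) ];

    [t, x] = ode45(rhs, [0 Ttot], [0; 0]);
    r = x(:,1);

    % running standard deviation of the response over a window of a few oscillation periods
    w    = 5*Ts;
    rstd = arrayfun(@(tc) std(r(t > tc - w & t <= tc)), t);
    plot(t, rstd); xlabel('time'); ylabel('running std of response');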
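
Finally, a sketch of the power-law-forgetting estimator for problem 3. The discrete, normalized form of the estimate and the regularization of the kernel at zero lag are my assumptions; the point is only to illustrate how a gap in observations lets the long pre-step history regain weight, so that the estimate drifts back toward its pre-step value.

    % Power-law-forgetting estimate of the deposition-rate ratio (problem 3): a sketch.
    % rhat(t) = sum over observed t' of K(t-t')*r(t') / sum of K(t-t'),  K(tau) ~ tau^(-alpha)

    N    = 1000; tt = 1:N;
    robs = 1 + (tt > 500);                % observed ratio: steps from 1 to 2 at t = 500
    obs  = true(1, N);
    obs(600:800) = false;                 % a gap during which the animal cannot observe events
    alpha = 1;                            % exponent of the power-law kernel

    rhat = zeros(1, N);
    for k = 1:N
        idx = find(obs(1:k));             % observation times available up to time k
        K   = (k - idx + 1).^(-alpha);    % power-law weights, regularized at zero lag
        rhat(k) = sum(K .* robs(idx)) / sum(K);
    end

    plot(tt, robs, '--'); hold on; plot(tt, rhat);
    xlabel('time'); ylabel('ratio'); legend('observed', 'estimate');
    % During the gap no new post-step observations arrive, so the many pre-step observations
    % regain relative weight and the estimate drifts back toward 1: reversal to status quo ante.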