Physics 434, 2012: Homework 5
Back to the main Teaching page.
Back to Physics 434, 2012: Information Processing in Biology.
- Consider a biochemical system for production of a protein that is somewhat different from what we explored in class: the number of proteins increases by one with a certain production rate, and it decreases with a rate set by self-degradation, in which the protein participates in its own removal, as is widely believed to occur in development (Eldar et al., 2003). Derive the deterministic equation describing the system. Then derive the equations describing its stochastic dynamics: the master, the Langevin, and the Fokker-Planck equations. Let's explore how well all four of these descriptions agree with each other. Write programs to solve all four of these equations (see below for suggestions). Start your simulations from a fixed initial protein number and fixed values of the rate constants. Explore different values of the rates and answer the following questions: (a) Under what conditions do all four simulation techniques largely agree with each other? (b) Which simulations are closer to each other: Langevin vs. Fokker-Planck, or Langevin vs. master equation? (c) Under which conditions is each of the two pairs close to each other? Illustrate your answers with graphs. The following should be helpful when you write your simulation code:
- Solve the deterministic chemical kinetics equation numerically using the Euler stepping method. That is, advance the solution in small time steps <math>\Delta t</math>: <math>n(t+\Delta t) = n(t) + \left.\frac{dn}{dt}\right|_{t}\,\Delta t</math>, where <math>n</math> is the protein number and <math>dn/dt</math> is given by the deterministic equation.
- For the master equation, formally the array of probabilities is infinite dimensional. But the probabilities of protein numbers much larger than the average value obtained from the equivalent deterministic equation are extremely small. So we may keep track of only a finite set of probabilities, up to a maximum protein number that is not too much larger than the expected deterministic mean. Then one can do Euler stepping again, now for each of these probabilities, to calculate their dynamics.
- For the Fokker-Planck equation, one can specify the probability distribution at discretized values of the protein number and approximate the spatial derivatives with finite differences, as you did in calculus, and then use the Euler method again.
- Finally, for the Langevin equation, we get a single random trajectory at a time. One must then generate many such random trajectories and histogram them to get an approximation to the probability distribution.
- Compare the distributions from the Langevin, Fokker-Planck, and master-equation simulations by plotting them on top of each other. A minimal code sketch of these four solvers is given below.
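The sketch below is one possible way to organize these four solvers in Python; it is not the required solution. The rate functions and all numbers in it are placeholders: a constant production rate <code>q</code>, a quadratic self-degradation rate <code>gamma*n**2</code>, the initial protein number <code>n0</code>, the time step, the total time, and the number of Langevin trajectories are assumed purely for illustration. Substitute the rates, initial condition, and parameter values given in the problem statement.

<pre>
# Minimal sketch (not a full solution) of the four solvers suggested above.
# The rate functions and all numbers here are PLACEHOLDERS: a constant
# production rate q and a quadratic self-degradation rate gamma*n**2 are
# assumed purely for illustration; substitute the rates, initial protein
# number, and parameter values given in the problem statement.
import numpy as np

q, gamma = 10.0, 0.01        # placeholder rate constants
n0 = 0.0                     # placeholder initial protein number
dt, T = 5e-4, 20.0           # Euler time step and total simulated time
steps = int(T / dt)

def birth(n):                # production propensity (assumed constant)
    return q

def death(n):                # degradation propensity (assumed ~ n^2, self-degradation)
    return gamma * n ** 2

# 1) Deterministic kinetics: Euler stepping of dn/dt = birth(n) - death(n).
n_det = n0
for _ in range(steps):
    n_det += (birth(n_det) - death(n_det)) * dt

# 2) Master equation: truncate to protein numbers 0..nmax and Euler-step
#    every probability P[n] simultaneously.
nmax = 4 * int(n_det) + 20               # cutoff well above the deterministic mean
n = np.arange(nmax + 1)
P = np.zeros(nmax + 1)
P[int(n0)] = 1.0
for _ in range(steps):
    gain = np.zeros_like(P)
    gain[1:] += birth(n[:-1]) * P[:-1]   # birth moves n-1 -> n
    gain[:-1] += death(n[1:]) * P[1:]    # death moves n+1 -> n
    loss = (birth(n) + death(n)) * P     # probability leaving state n
    P += (gain - loss) * dt

# 3) Fokker-Planck: probability density on a grid in n, spatial derivatives
#    by finite differences, Euler stepping in time.
x = np.linspace(0.0, nmax, 2 * nmax + 1)
dx = x[1] - x[0]
rho = np.exp(-0.5 * (x - n0) ** 2)       # narrow initial peak near n0
rho /= rho.sum() * dx
A = birth(x) - death(x)                  # drift
D = 0.5 * (birth(x) + death(x))          # diffusion
for _ in range(steps):
    J = A * rho - np.gradient(D * rho, dx)    # probability current
    rho += -np.gradient(J, dx) * dt           # d rho / dt = -dJ/dn
    rho = np.clip(rho, 0.0, None)             # crude fix for boundary artifacts
    rho /= rho.sum() * dx

# 4) Langevin: many Euler-Maruyama trajectories, histogrammed at the end
#    to approximate the distribution for comparison with P and rho.
rng = np.random.default_rng(0)
traj = np.full(1000, float(n0))
for _ in range(steps):
    drift = birth(traj) - death(traj)
    noise = np.sqrt((birth(traj) + death(traj)) * dt) * rng.standard_normal(traj.size)
    traj = np.maximum(traj + drift * dt + noise, 0.0)   # keep copy number non-negative
P_langevin, edges = np.histogram(traj, bins=np.arange(nmax + 2) - 0.5, density=True)

print("deterministic steady state ~", n_det)
print("master-equation mean       ~", (n * P).sum())
print("Fokker-Planck mean         ~", (x * rho).sum() * dx)
print("Langevin mean              ~", traj.mean())
</pre>

For these placeholder rates the deterministic steady state is <math>\sqrt{q/\gamma}</math>, which provides a quick sanity check on all four outputs; systematic disagreements, especially at small copy numbers, are exactly what questions (a)-(c) ask you to explore.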
- Analytically calculate the entropy of a Gaussian random variable.
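For reference when setting up this calculation: the differential entropy is defined by the integral below, and the standard result for a Gaussian of variance <math>\sigma^2</math>, which your derivation should reproduce, is given on the right (in nats; divide by <math>\ln 2</math> for bits).

<math>
S = -\int_{-\infty}^{\infty} p(x)\,\ln p(x)\,dx, \qquad
p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/2\sigma^2}
\;\;\Rightarrow\;\;
S = \frac{1}{2}\ln\!\left(2\pi e\,\sigma^2\right).
</math>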
- How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate <math>r</math>, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals <math>\Delta t</math>. Explain why the entropy of the Poisson-generated sequence of duration <math>T</math> (or, alternatively, <math>T/\Delta t</math> symbols) is exactly proportional to time, that is, <math>S(T) = cT</math>, where <math>c</math> is some constant. Thus we only need to calculate the entropy of a single symbol, call it <math>S_1</math>, in order to find the entropy rate as <math>S_1/\Delta t</math>. Does this rate have a finite value as <math>\Delta t \to 0</math>? Why or why not?
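One way to set up this calculation (the symbols <math>r</math> and <math>\Delta t</math> follow the notation above): in each bin the discretized process either contains a spike, with probability <math>p = r\,\Delta t</math>, or does not, independently of all other bins; to leading order the probability of two spikes in one bin, of order <math>(r\,\Delta t)^2</math>, can be neglected. The independence of the bins is what makes the total entropy additive, so the entropy rate is the single-bin entropy divided by <math>\Delta t</math>:

<math>
S_1 = -p\ln p - (1-p)\ln(1-p), \qquad p = r\,\Delta t, \qquad
\frac{S(T)}{T} = \frac{S_1}{\Delta t}.
</math>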
- Graduate students: Suppose now that the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a fixed refractory time. What is the entropy rate of such a neuron?