# Physics 434, 2012: Homework 5

1. Consider a biochemical system for production of a protein that is somewhat different from what we explored in class: the number of proteins $n$ increases by one at a rate $\alpha$ and decreases at a rate $rn^{2}$ . This happens when the protein is involved in its own degradation, as is widely believed to be true in development (Eldar et al., 2003). Derive the deterministic equation describing the system. Then derive the equations describing its stochastic dynamics: the master, the Langevin, and the Fokker-Planck equations. Let's explore how well all four of these descriptions agree with each other. Write programs to solve all four of these equations (see below for suggestions). Start your simulations with $n(t=0)=60$ , $\alpha =50$ , and $r=1/50$ . Explore different values of $\alpha ,r$ and answer the following questions: (a) Under which conditions do all four simulation techniques largely agree with each other? (b) Which simulations are closer to each other: Langevin vs. Fokker-Planck, or Langevin vs. master equation? (c) Under which conditions is each of the two pairs close to each other? Illustrate your answers with graphs. The following should be helpful when you write your simulation code:
• Solve the deterministic chemical kinetics equation numerically using the Euler stepping method. That is: $n(t+\Delta t)=n(t)+{\frac {dn}{dt}}\Delta t$ .
• For the master equation, the array of probabilities $P_{n}$ is formally infinite dimensional. But the probabilities $P_{n}$ for $n\gg {\bar {n}}+10{\sqrt {\bar {n}}}$ , where ${\bar {n}}$ is the average value of $n$ obtained from the equivalent deterministic equation, are extremely small. So we may keep track of only a finite set of probabilities $P_{n}$ , for $n$ not too much larger than the expected deterministic mean. Then one can do Euler stepping again, now for each of the $P_{n}$ , to calculate the dynamics of each of these probabilities.
• For the Fokker-Planck equation, one can specify $P(n,t)$ at discretized values of $n$ and approximate the spatial derivatives with finite differences, as you did in calculus. Then use the Euler method again.
• Finally, each run of the Langevin equation produces a single random trajectory $n(t)$ . One must generate many such random trajectories and histogram them to get an approximation to $P(n,t)$ .
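Putting the four suggestions together, here is a minimal NumPy sketch with the parameter values from the problem. The time step, total time, truncation size $N$, and ensemble size are illustrative choices, not prescribed; in particular the Langevin noise amplitude $\sqrt{(\alpha + rn^2)\,\Delta t}$ assumes the usual birth-death noise term, which you should verify against your own derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

alpha, r = 50.0, 1.0 / 50.0     # rates from the problem statement
n0, dt, T = 60.0, 1.0e-3, 5.0   # initial count, time step, total time (illustrative)
steps = int(T / dt)

# 1) Deterministic kinetics, Euler stepping of dn/dt = alpha - r*n^2
n_det = n0
for _ in range(steps):
    n_det += (alpha - r * n_det**2) * dt

# 2) Master equation, truncated at N states and Euler-stepped:
#    dP_n/dt = alpha*P_{n-1} + r*(n+1)^2*P_{n+1} - (alpha + r*n^2)*P_n
N = 150                          # comfortably above nbar + 10*sqrt(nbar) ~ 121
n = np.arange(N)
P = np.zeros(N)
P[int(n0)] = 1.0
for _ in range(steps):
    birth = alpha * P            # probability flux n -> n+1
    death = r * n**2 * P         # probability flux n -> n-1
    dP = -birth - death
    dP[1:] += birth[:-1]
    dP[:-1] += death[1:]
    P += dt * dP
mean_master = np.sum(n * P)

# 3) Fokker-Planck, central finite differences on a grid of spacing 1:
#    dP/dt = -d/dn[(alpha - r*n^2) P] + (1/2) d^2/dn^2[(alpha + r*n^2) P]
x = np.arange(N, dtype=float)
A = alpha - r * x**2             # drift coefficient
D = 0.5 * (alpha + r * x**2)     # diffusion coefficient
Q = np.zeros(N)
Q[int(n0)] = 1.0                 # grid spacing 1, so density equals probability
for _ in range(steps):
    AQ, DQ = A * Q, D * Q
    dQ = np.zeros(N)
    dQ[1:-1] = -(AQ[2:] - AQ[:-2]) / 2.0 + (DQ[2:] - 2.0 * DQ[1:-1] + DQ[:-2])
    Q += dt * dQ
mean_fp = np.sum(x * Q)

# 4) Langevin, run as a whole ensemble of trajectories at once:
#    dn = (alpha - r*n^2)*dt + sqrt((alpha + r*n^2)*dt) * xi
n_traj = np.full(200, n0)
for _ in range(steps):
    drift = alpha - r * n_traj**2
    diff = alpha + r * n_traj**2
    n_traj += drift * dt + np.sqrt(diff * dt) * rng.standard_normal(n_traj.size)
    n_traj = np.maximum(n_traj, 0.0)   # counts cannot go negative
hist, edges = np.histogram(n_traj, bins=30, density=True)

print(n_det, mean_master, mean_fp, n_traj.mean())
```

All four estimates should settle near the deterministic fixed point $\sqrt{\alpha /r}=50$ for these parameter values; note that the explicit Euler updates for the master and Fokker-Planck equations are only stable if $\Delta t$ is small compared to the inverse of the largest rate on the grid.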
3. How much information can a spiking neuron transmit? This is bounded from above by its entropy rate. Let's model the neuron as emitting action potentials according to a Poisson process with a certain rate $r$ , and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals of length $\Delta t$ . Explain why the entropy of the Poisson-generated sequence of duration $T$ (or, equivalently, $n=T/\Delta t$ symbols) is exactly proportional to time, that is, $S=sn$ , where $s$ is some constant. Thus we only need to calculate the entropy of a single symbol, this $s$ , in order to find the entropy rate as $R={\frac {S}{T}}={\frac {sn}{T}}={\frac {s}{\Delta t}}$ . Does this rate have a finite value as $\Delta t\to 0$ ? Why or why not?
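As a numerical probe of the $\Delta t\to 0$ limit (not a substitute for the derivation asked for above), one can evaluate the single-bin entropy of the discretized process, where each bin contains a spike with probability $p=r\Delta t$. The value of $r$ below is an arbitrary illustrative choice:

```python
import numpy as np

r = 40.0  # illustrative firing rate, spikes per second (assumed, not from the problem)

def binary_entropy(p):
    """Entropy in bits of one time bin: spike with prob p, silence with prob 1 - p."""
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

for dt in (1e-2, 1e-3, 1e-4, 1e-5):
    p = r * dt
    print(f"dt = {dt:.0e} s  entropy rate = {binary_entropy(p) / dt:.1f} bits/s")
```

Watching how the printed rate behaves as $\Delta t$ shrinks should tell you what to expect from the analytic limit.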
4. Graduate students: Suppose now that the neuron has what's called a refractory period. That is, after a spike, the neuron cannot fire for a time $\tau _{r}$ . What is the entropy rate of such a neuron?
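To build intuition before doing the calculation, one can simulate a Poisson neuron with a dead time and see how the refractory period constrains the spike train. All the numerical values below ($r$, $\tau _{r}$, the bin size, the duration) are illustrative assumptions, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(1)
r, tau_r, dt, T = 40.0, 5e-3, 1e-4, 20.0  # illustrative rate, dead time, bin, duration

nbins, refrac = int(T / dt), int(tau_r / dt)
train = np.zeros(nbins, dtype=np.uint8)
last = -refrac - 1
for i in range(nbins):
    # fire with probability r*dt per bin, but only once the refractory window has passed
    if i - last > refrac and rng.random() < r * dt:
        train[i] = 1
        last = i

# the dead time lowers the effective firing rate to roughly r / (1 + r*tau_r)
print(train.sum() / T)
```

The effective rate falling below $r$ is a hint that the refractory period also reduces the number of allowed spike sequences, and hence the entropy rate, relative to the pure Poisson case.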