Physics 434, 2015: Homework 10

1. Let's explore the channel coding theorem (and also learn how to do optimization in Matlab and Octave). Suppose we have the following molecular information processing channel: for any number of molecules ${\displaystyle x}$ on the input, the channel output ${\displaystyle y}$ is a Poisson variable with that mean (this is relevant to Ziv et al., 2007, which was discussed by Martin in one of the lectures). That is, ${\displaystyle P(y|x)={\frac {x^{y}e^{-x}}{y!}}}$. Write code to estimate the mutual information over this channel for an arbitrary distribution of the (discrete) input signal; the function you write should take the input distribution as an argument. Explore different input distributions, assuming that the number of input molecules is between 0 and 64. What are the general features of the input distributions that achieve higher mutual information? Recall that Ziv et al. have shown that you should be able to send ${\displaystyle 1/2\log _{2}{\bar {N}}\approx 1/2\log _{2}(64/2)=2.5}$ bits through this channel, taking the mean input to be ${\displaystyle {\bar {N}}\approx 32}$, half the maximum. Can you find a distribution that allows you to send close to these ${\displaystyle \approx 2.5}$ bits? Submit plots of your "most informative" distributions.
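As a starting point, here is a minimal sketch (in Python rather than Matlab/Octave, but the translation is direct) of the mutual-information estimate ${\displaystyle I(X;Y)=\sum _{x}P(x)\sum _{y}P(y|x)\log _{2}{\frac {P(y|x)}{P(y)}}}$ for this Poisson channel. The function name `mutual_information` and the output cutoff `y_max` are choices made here, not part of the assignment; the cutoff is safe because inputs are at most 64 molecules, so the Poisson output tail beyond 200 is negligible.

```python
import math

def mutual_information(p_x, y_max=200):
    """Estimate I(X;Y) in bits for the channel y ~ Poisson(x),
    where p_x[x] is the probability of input x = 0, 1, ..., len(p_x)-1.
    The output alphabet is truncated at y_max (an assumption; ample
    when inputs are at most 64 molecules)."""
    xs = range(len(p_x))

    # Conditional distribution P(y|x) = x^y e^{-x} / y!,
    # computed in log space via lgamma to avoid overflow in y!.
    def p_y_given_x(y, x):
        if x == 0:
            return 1.0 if y == 0 else 0.0
        return math.exp(y * math.log(x) - x - math.lgamma(y + 1))

    # Marginal P(y) = sum_x P(x) P(y|x)
    p_y = [sum(p_x[x] * p_y_given_x(y, x) for x in xs)
           for y in range(y_max + 1)]

    # I(X;Y) = sum_{x,y} P(x) P(y|x) log2[ P(y|x) / P(y) ]
    mi = 0.0
    for x in xs:
        if p_x[x] == 0.0:
            continue
        for y in range(y_max + 1):
            pyx = p_y_given_x(y, x)
            if pyx > 0.0 and p_y[y] > 0.0:
                mi += p_x[x] * pyx * math.log2(pyx / p_y[y])
    return mi
```

Two quick sanity checks: a deterministic input (all mass on one value) gives zero mutual information, and an input concentrated on the two well-separated values 0 and 64 gives very nearly 1 bit, since Poisson(0) and Poisson(64) outputs almost never overlap. From there you can feed in candidate distributions (uniform, peaked, bimodal, and so on) and compare.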