<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://nemenmanlab.org/~ilya/index.php?action=history&amp;feed=atom&amp;title=Physics_434%2C_2014%3A_Homework_7</id>
	<title>Physics 434, 2014: Homework 7 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://nemenmanlab.org/~ilya/index.php?action=history&amp;feed=atom&amp;title=Physics_434%2C_2014%3A_Homework_7"/>
	<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_434,_2014:_Homework_7&amp;action=history"/>
	<updated>2026-05-17T10:39:21Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.31.0</generator>
	<entry>
		<id>https://nemenmanlab.org/~ilya/index.php?title=Physics_434,_2014:_Homework_7&amp;diff=585&amp;oldid=prev</id>
		<title>Ilya: 1 revision imported</title>
		<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_434,_2014:_Homework_7&amp;diff=585&amp;oldid=prev"/>
		<updated>2018-07-04T16:28:43Z</updated>

		<summary type="html">&lt;p&gt;1 revision imported&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 16:28, 4 July 2018&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-notice&quot; lang=&quot;en&quot;&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Ilya</name></author>
		
	</entry>
	<entry>
		<id>https://nemenmanlab.org/~ilya/index.php?title=Physics_434,_2014:_Homework_7&amp;diff=584&amp;oldid=prev</id>
		<title>nemenman&gt;Ilya at 14:01, 14 November 2014</title>
		<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_434,_2014:_Homework_7&amp;diff=584&amp;oldid=prev"/>
		<updated>2014-11-14T14:01:51Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{PHYS434-2014}}&lt;br /&gt;
&lt;br /&gt;
#Show that &amp;lt;math&amp;gt;I[X;Y]=S[X]-S[X|Y]&amp;lt;/math&amp;gt; is equal to &amp;lt;math&amp;gt;S[X]+S[Y]-S[X,Y]&amp;lt;/math&amp;gt;.&lt;br /&gt;
#What is the differential entropy of an exponential distribution?&lt;br /&gt;
#How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with some rate &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt;, and let's calculate the entropy rate of this process. First represent the process by discretizing time into intervals of length &amp;lt;math&amp;gt;\Delta t&amp;lt;/math&amp;gt;. Explain why the entropy of the Poisson-generated sequence of duration &amp;lt;math&amp;gt;T&amp;lt;/math&amp;gt; (or, alternatively, &amp;lt;math&amp;gt;n=T/\Delta t&amp;lt;/math&amp;gt; symbols) is ''exactly'' proportional to time, that is, &amp;lt;math&amp;gt;S=sn&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt; is some constant. Thus we only need to calculate the entropy &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt; of a single symbol in order to find the entropy rate as &amp;lt;math&amp;gt;R=\frac{sn}{T}=\frac{s}{\Delta t}&amp;lt;/math&amp;gt;. Does this rate have a finite value as &amp;lt;math&amp;gt;\Delta t\to 0&amp;lt;/math&amp;gt;? Why or why not? Estimate the maximum bit rate of a neuron that can control the placement of its spikes to an accuracy of 1 ms.&lt;br /&gt;
#''Graduate students:'' Suppose now that the neuron has what's called a ''refractory period''. That is, after a spike, the neuron cannot fire for a time &amp;lt;math&amp;gt;\tau_r&amp;lt;/math&amp;gt;. What is the entropy rate of such a neuron?&lt;br /&gt;
#Let's explore the channel coding theorem (and also learn how to do optimization in Matlab and Octave). Suppose we have the following molecular information processing channel: for any number of molecules &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; on the input, the channel output &amp;lt;math&amp;gt;y&amp;lt;/math&amp;gt; is a Poisson variable with that mean (this is relevant to Ziv et al., 2007, which was discussed by Martin in one of the lectures). That is, &amp;lt;math&amp;gt;P(y|x)=\frac{x^ye^{-x}}{y!}&amp;lt;/math&amp;gt;. Write code to estimate the mutual information over this channel for an arbitrary distribution of the (discrete) input signal. Use the input distribution as an input to the function you write. Explore different input distributions, assuming that the number of input molecules is between 0 and 64.  What are the general features of the input distributions that achieve higher mutual information? Recall that Ziv et al. have shown that you should be able to send &amp;lt;math&amp;gt;\frac{1}{2}\log_2 \bar{N}\approx \frac{1}{2}\log_2(64/2)=2.5&amp;lt;/math&amp;gt; bits through this channel. Can you find a distribution that allows you to send close to these &amp;lt;math&amp;gt;\approx 2.5&amp;lt;/math&amp;gt; bits? Submit plots of your &amp;quot;most informative&amp;quot; distributions.&lt;/div&gt;</summary>
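One way to see the small-&lt;math&gt;\Delta t&lt;/math&gt; behavior asked about in problem 3 — a sketch, not the required solution. In a bin of width &lt;math&gt;\Delta t&lt;/math&gt;, the spike probability is &lt;math&gt;p=r\,\Delta t&lt;/math&gt; for small &lt;math&gt;\Delta t&lt;/math&gt;, so the per-symbol entropy and the rate behave as:

```latex
\begin{align}
s &= -p\log_2 p-(1-p)\log_2(1-p), \qquad p=r\,\Delta t\\
  &\approx r\,\Delta t\,\log_2\!\frac{e}{r\,\Delta t} \qquad (\Delta t\to 0)\\
R &= \frac{s}{\Delta t}\approx r\log_2\!\frac{e}{r\,\Delta t},
\end{align}
```

which grows logarithmically as &lt;math&gt;\Delta t\to 0&lt;/math&gt;, so the timing precision sets the scale of the achievable bit rate.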
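Problem 5 asks for code that estimates the mutual information through the Poisson channel for a given input distribution. Below is a minimal sketch of one way to do this, shown in Python rather than the Matlab/Octave suggested by the assignment; the function names, the truncation of the output at &lt;math&gt;y=200&lt;/math&gt;, and the uniform baseline input are illustrative assumptions, not part of the assignment.

```python
import math

def poisson_pmf(y, mean):
    # P(y | mean); mean = 0 is a point mass at y = 0
    if mean == 0:
        return 1.0 if y == 0 else 0.0
    return math.exp(y * math.log(mean) - mean - math.lgamma(y + 1))

def mutual_information(p_x, y_max=200):
    """I[X;Y] in bits for the channel P(y|x) = x^y e^-x / y!,
    where p_x[x] is the input distribution over x = 0..len(p_x)-1.
    Truncating y at y_max is safe here: for means up to 64 the
    Poisson tail beyond 200 is negligible."""
    xs = range(len(p_x))
    ys = range(y_max + 1)
    # marginal P(y) = sum_x p(x) P(y|x)
    p_y = [sum(p_x[x] * poisson_pmf(y, x) for x in xs) for y in ys]
    mi = 0.0
    for x in xs:
        if p_x[x] == 0:
            continue
        for y in ys:
            pyx = poisson_pmf(y, x)
            if pyx > 0 and p_y[y] > 0:
                mi += p_x[x] * pyx * math.log2(pyx / p_y[y])
    return mi

# uniform input over 0..64 as a baseline to compare against the
# roughly 2.5-bit figure quoted from Ziv et al.
n = 65
uniform = [1.0 / n] * n
print(round(mutual_information(uniform), 3))
```

This uses the form &lt;math&gt;I[X;Y]=\sum_{x,y}P(x)P(y|x)\log_2\frac{P(y|x)}{P(y)}&lt;/math&gt;; the log-space pmf via `math.lgamma` avoids overflow in &lt;math&gt;y!&lt;/math&gt; for large &lt;math&gt;y&lt;/math&gt;.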
		<author><name>nemenman&gt;Ilya</name></author>
		
	</entry>
</feed>