<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://nemenmanlab.org/~ilya/index.php?action=history&amp;feed=atom&amp;title=Physics_380%2C_2011%3A_Homework_6</id>
	<title>Physics 380, 2011: Homework 6 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://nemenmanlab.org/~ilya/index.php?action=history&amp;feed=atom&amp;title=Physics_380%2C_2011%3A_Homework_6"/>
	<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Homework_6&amp;action=history"/>
	<updated>2026-05-17T09:39:53Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.31.0</generator>
	<entry>
		<id>https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Homework_6&amp;diff=323&amp;oldid=prev</id>
		<title>Ilya: 1 revision imported</title>
		<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Homework_6&amp;diff=323&amp;oldid=prev"/>
		<updated>2018-07-04T16:28:41Z</updated>

		<summary type="html">&lt;p&gt;1 revision imported&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 16:28, 4 July 2018&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-notice&quot; lang=&quot;en&quot;&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Ilya</name></author>
		
	</entry>
	<entry>
		<id>https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Homework_6&amp;diff=322&amp;oldid=prev</id>
		<title>nemenman&gt;Ilya at 02:07, 3 October 2011</title>
		<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Homework_6&amp;diff=322&amp;oldid=prev"/>
		<updated>2011-10-03T02:07:47Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{PHYS380-2011}}&lt;br /&gt;
#Analytically calculate the entropy of a Gaussian random variable.&lt;br /&gt;
#We start with a simple problem. In class, we have defined the mutual information between &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; as a difference between a marginal and a conditional entropy, &amp;lt;math&amp;gt;I[X;Y]=S[X]- S[X|Y]&amp;lt;/math&amp;gt;. Rewrite this expression to depend only on unconditional entropies. What does it say about the relation between the joint entropy of two variables and the two marginal entropies?&lt;br /&gt;
#How much information can a spiking neuron transmit? This is limited from above by its entropy rate. Let's represent a neuron as releasing action potentials according to a Poisson process with a certain rate &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt;, and let's calculate the entropy rate of the Poisson process. First represent this process by discretizing time into intervals of duration &amp;lt;math&amp;gt;\Delta t&amp;lt;/math&amp;gt;. Explain why the entropy of the Poisson-generated sequence of duration &amp;lt;math&amp;gt;T&amp;lt;/math&amp;gt; (or, alternatively, &amp;lt;math&amp;gt;n=T/\Delta t&amp;lt;/math&amp;gt; symbols) is exactly proportional to time, that is, &amp;lt;math&amp;gt;S=sn&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt; is some constant. Thus we only need to calculate the entropy of a single symbol, this &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt;, in order to find the entropy rate as &amp;lt;math&amp;gt;R=\frac{S}{T}=\frac{sn}{T}=\frac{s}{\Delta t}&amp;lt;/math&amp;gt;. Does this rate have a finite value as &amp;lt;math&amp;gt;\Delta t\to0&amp;lt;/math&amp;gt;? Why or why not?&lt;br /&gt;
#''Graduate students:'' Suppose now the neuron has what's called a ''refractory period''. That is, after a spike, the neuron cannot fire for a time &amp;lt;math&amp;gt;\tau_r&amp;lt;/math&amp;gt;. What is the entropy rate of such a neuron?&lt;br /&gt;
#Consider information transmission in a simple model of a molecular circuit. The input signal of &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt; molecules is distributed as a Gaussian random variable with mean &amp;lt;math&amp;gt;\mu_s&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma_s^2&amp;lt;/math&amp;gt;. The elicited mean response is &amp;lt;math&amp;gt;\mu_r=g\mu_s&amp;lt;/math&amp;gt;, and it has the usual counting (a.k.a. Poisson, or square-root) fluctuations around the mean. Assuming that the response molecule count is large, &amp;lt;math&amp;gt;r\gg1&amp;lt;/math&amp;gt;, what is the mutual information between &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt;? (To answer this, you will need to understand what the marginal distribution of &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt; is.) Now suppose that there are &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; independent signals &amp;lt;math&amp;gt;s_n&amp;lt;/math&amp;gt;, and each is measured by an independent response &amp;lt;math&amp;gt;r_n&amp;lt;/math&amp;gt;, such that, as above, &amp;lt;math&amp;gt;\mu_{r_n}=g_n\mu_{s_n}&amp;lt;/math&amp;gt;. What is the total mutual information between all the signals on the one hand and all the responses on the other?&lt;/div&gt;</summary>
		<author><name>nemenman&gt;Ilya</name></author>
		
	</entry>
</feed>