<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://nemenmanlab.org/~ilya/index.php?action=history&amp;feed=atom&amp;title=Physics_380%2C_2011%3A_Lecture_10</id>
	<title>Physics 380, 2011: Lecture 10 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://nemenmanlab.org/~ilya/index.php?action=history&amp;feed=atom&amp;title=Physics_380%2C_2011%3A_Lecture_10"/>
	<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Lecture_10&amp;action=history"/>
	<updated>2026-05-17T09:40:01Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.31.0</generator>
	<entry>
		<id>https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Lecture_10&amp;diff=321&amp;oldid=prev</id>
		<title>Ilya: 1 revision imported</title>
		<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Lecture_10&amp;diff=321&amp;oldid=prev"/>
		<updated>2018-07-04T16:28:41Z</updated>

		<summary type="html">&lt;p&gt;1 revision imported&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;1&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 16:28, 4 July 2018&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-notice&quot; lang=&quot;en&quot;&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Ilya</name></author>
		
	</entry>
	<entry>
		<id>https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Lecture_10&amp;diff=320&amp;oldid=prev</id>
		<title>nemenman&gt;Ilya at 02:04, 4 October 2012</title>
		<link rel="alternate" type="text/html" href="https://nemenmanlab.org/~ilya/index.php?title=Physics_380,_2011:_Lecture_10&amp;diff=320&amp;oldid=prev"/>
		<updated>2012-10-04T02:04:21Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{PHYS380-2011}}&lt;br /&gt;
&lt;br /&gt;
In these lectures, we cover some background on information theory. A good physics-style introduction to the subject can be found in the upcoming book by Bialek (Bialek, 2010). A very nice, and probably still the best, introduction to information theory as a theory of communication is (Shannon and Weaver, 1949). A standard and very good textbook on information theory is (Cover and Thomas, 2006).&lt;br /&gt;
&lt;br /&gt;
==Finishing the previous lecture==&lt;br /&gt;
*Mutual information: what if we want to know about a variable &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt;, but instead measure a variable &amp;lt;math&amp;gt;y&amp;lt;/math&amp;gt;? How much do we then learn about &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt;? This is given by the difference of the entropies of &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; before and after the measurement: &amp;lt;math&amp;gt;\begin{array}{ll}I[X;Y]&amp;amp;=S[X]-\langle S[X|Y]\rangle_y\\&amp;amp;=S[X]+S[Y]-S[X,Y]\\&amp;amp;=\left\langle\log_2\frac{P(x,y)}{P(x)P(y)}\right\rangle_{x,y}\end{array}&amp;lt;/math&amp;gt;.&lt;br /&gt;
*Meaning of mutual information: a mutual information of 1 bit between two variables means that, by querying one of them as thoroughly as we like, we can get one bit of information about the other.&lt;br /&gt;
*Properties of mutual information&lt;br /&gt;
*#Limits: &amp;lt;math&amp;gt;0\le I[X;Y]\le \min(S[X],S[Y])&amp;lt;/math&amp;gt;. Note that the first inequality becomes an equality iff the two variables are statistically independent (see the numerical sketch at the end of this section).&lt;br /&gt;
*#Mutual information is well-defined for continuous variables: unlike the entropy of a continuous variable, it stays finite as the discretization is refined and does not depend on the units used.&lt;br /&gt;
*#Reparameterization invariance: for any invertible &amp;lt;math&amp;gt;\xi=\xi(x),\, \eta=\eta(y)&amp;lt;/math&amp;gt;, we have &amp;lt;math&amp;gt;I[X;Y]=I[\xi(X);\eta(Y)]&amp;lt;/math&amp;gt;.&lt;br /&gt;
*#Data processing inequality: for a Markov chain &amp;lt;math&amp;gt;P(x,y,z)=P(x)P(y|x)P(z|y)&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;I[X;Z]\le \min (I[X;Y], I[Y;Z])&amp;lt;/math&amp;gt;. That is, information cannot be created by transforming a variable, whether deterministically or probabilistically.&lt;br /&gt;
*#Information rate: Information is also an extensive quantity, so that it makes sense to define an information rate &amp;lt;math&amp;gt;I_0=\lim_{n\to\infty}I[X_1,\dots,X_n;Y_1,\dots,Y_n]/n&amp;lt;/math&amp;gt;.&lt;br /&gt;
*Mutual information of a bivariate normal with a correlation coefficient &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt; is &amp;lt;math&amp;gt;I=-1/2 \log_2(1-\rho^2)&amp;lt;/math&amp;gt;.&lt;br /&gt;
*For Gaussian variables &amp;lt;math&amp;gt;y=g(x+\eta)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; is the signal, &amp;lt;math&amp;gt;y&amp;lt;/math&amp;gt; is the response, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; is the noise added to the input, &amp;lt;math&amp;gt;I[X;Y]=\frac{1}{2}\log_2\left(1+\frac{\sigma^2_x}{\sigma^2_\eta}\right)=\frac{1}{2}\log_2(1+SNR)&amp;lt;/math&amp;gt;. A quick numerical check of this and of the bivariate-normal formula is included below.&lt;br /&gt;
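&lt;br /&gt;
A minimal numerical sketch of the quantities above (plain Python with NumPy; the 3x3 joint table, the relabeling, and the noisy channel &amp;lt;math&amp;gt;P(z|y)&amp;lt;/math&amp;gt; are made up for illustration). It computes &amp;lt;math&amp;gt;I[X;Y]=S[X]+S[Y]-S[X,Y]&amp;lt;/math&amp;gt; for a discrete joint distribution, checks the bounds &amp;lt;math&amp;gt;0\le I[X;Y]\le\min(S[X],S[Y])&amp;lt;/math&amp;gt;, the invariance under an invertible relabeling, and the data processing inequality:&lt;br /&gt;
&amp;lt;pre&amp;gt;
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero entries contribute nothing."""
    p = np.asarray(p, dtype=float)
    safe = np.where(p == 0, 1.0, p)          # log2(1) = 0
    return float(-(p * np.log2(safe)).sum())

def mutual_information(pxy):
    """I[X;Y] = S[X] + S[Y] - S[X,Y] for a joint probability table pxy."""
    pxy = np.asarray(pxy, dtype=float)
    return (entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0))
            - entropy(pxy.ravel()))

# Made-up joint distribution of two correlated ternary variables (sums to 1).
pxy = np.array([[0.20, 0.05, 0.05],
                [0.05, 0.20, 0.05],
                [0.05, 0.05, 0.30]])

I = mutual_information(pxy)
Sx = entropy(pxy.sum(axis=1))
Sy = entropy(pxy.sum(axis=0))
print(I, Sx, Sy)                             # 0 and min(Sx, Sy) bracket I

# Reparameterization invariance: an invertible relabeling of x and of y
# leaves the mutual information unchanged.
perm = [2, 0, 1]
print(mutual_information(pxy[perm][:, perm]))  # same value as I

# Data processing inequality: pass y through a made-up noisy channel P(z|y).
pz_given_y = np.array([[0.8, 0.1, 0.1],
                       [0.1, 0.8, 0.1],
                       [0.1, 0.1, 0.8]])
pxz = pxy @ pz_given_y                       # P(x,z) = sum_y P(x,y) P(z|y)
pyz = np.diag(pxy.sum(axis=0)) @ pz_given_y  # P(y,z) = P(y) P(z|y)
print(mutual_information(pxz),               # I[X;Z] does not exceed
      mutual_information(pxy),               # min(I[X;Y], I[Y;Z])
      mutual_information(pyz))
&amp;lt;/pre&amp;gt;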
&lt;br /&gt;
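A second numerical sketch (again NumPy; the grid range, the number of bins, and the values of the correlation coefficient and SNR are illustrative) cross-checks the two Gaussian results above: it bins a bivariate normal density on a fine grid, computes the binned mutual information, and compares it to &amp;lt;math&amp;gt;-\frac{1}{2}\log_2(1-\rho^2)&amp;lt;/math&amp;gt;; the signal-to-noise form then follows from &amp;lt;math&amp;gt;\rho^2=\frac{\sigma^2_x}{\sigma^2_x+\sigma^2_\eta}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;pre&amp;gt;
import numpy as np

def binned_mi(pxy):
    """Mutual information in bits of a normalized 2-D probability table."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    ratio = np.where(pxy == 0, 1.0, pxy / (px * py))
    return float((pxy * np.log2(ratio)).sum())

rho = 0.8                                  # correlation coefficient
x = np.linspace(-6.0, 6.0, 600)            # fine grid: the binned mutual
X, Y = np.meshgrid(x, x, indexing='ij')    # information approaches the
dens = np.exp(-(X**2 - 2*rho*X*Y + Y**2) / (2*(1 - rho**2)))
pxy = dens / dens.sum()                    # continuous value for fine bins

print(binned_mi(pxy))                      # approx 0.737 bits
print(-0.5 * np.log2(1 - rho**2))          # exact bivariate-normal formula

# Gaussian channel y = g(x + eta): rho^2 = SNR/(1 + SNR), so the same
# formula reads I = 1/2 log2(1 + SNR).
snr = 3.0
print(0.5 * np.log2(1 + snr), -0.5 * np.log2(1 - snr/(1 + snr)))  # equal
&amp;lt;/pre&amp;gt;
&lt;br /&gt;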
==Warmup question==&lt;br /&gt;
#Consider transmitting information through a synthetic transcriptional circuit in ''E. coli'' (Guet et al., 2002) -- see picture on the board. Which of the following quantities might constrain the mutual information between the chemical signal and the expressed reporter response?&lt;br /&gt;
#*The mean molecular copy number of the reporter molecule.&lt;br /&gt;
#*The mean molecular copy number of the other, non-reporter genes.&lt;br /&gt;
#*The probability distribution of the input signals.&lt;br /&gt;
&lt;br /&gt;
In this lecture, we will try to derive the limits on the quality of information processing in molecular circuits.&lt;br /&gt;
&lt;br /&gt;
==Main Lecture==&lt;br /&gt;
*We follow the discussion of Ziv et al., 2007.&lt;br /&gt;
*Consider a chain: signal s -&amp;gt; mean response &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt; -&amp;gt; actual noisy response r. By the data processing inequality, &amp;lt;math&amp;gt;I[S;R]\le I[\rho; R]&amp;lt;/math&amp;gt;.&lt;br /&gt;
*Assuming that &amp;lt;math&amp;gt;\rho\gg 1&amp;lt;/math&amp;gt;, and following the general formula for noise propagation from two lectures ago, we get &amp;lt;math&amp;gt;r=\rho+\eta&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;\sigma^2_\eta=\rho&amp;lt;/math&amp;gt;.&lt;br /&gt;
*A simple counting argument suggests that for N molecules with a &amp;lt;math&amp;gt;\sqrt{N}&amp;lt;/math&amp;gt; counting error there are about &amp;lt;math&amp;gt;\sqrt{N}&amp;lt;/math&amp;gt; distinguishable states, so the information is limited to &amp;lt;math&amp;gt;I\approx 1/2 \log_2 N&amp;lt;/math&amp;gt; (see the numerical sketch at the end of these notes).&lt;br /&gt;
*For a fixed mean response N, we can calculate the &amp;lt;math&amp;gt;P(\rho)&amp;lt;/math&amp;gt; that maximizes &amp;lt;math&amp;gt;I[\rho;R]&amp;lt;/math&amp;gt;: &amp;lt;math&amp;gt;P(\rho)=\frac{1}{(2\pi \rho N)^{1/2}}\exp[-\rho/(2N)]&amp;lt;/math&amp;gt;, and the information becomes &amp;lt;math&amp;gt;I\approx 1/2 \log_2 N+O(1/N)&amp;lt;/math&amp;gt;.&lt;br /&gt;
*This is the best possible result, but in a biochemical system not every &amp;lt;math&amp;gt;P(\rho)&amp;lt;/math&amp;gt; is realizable. Biochemical systems are typically modeled with Hill activation/suppression dynamics, &amp;lt;math&amp;gt;\frac{dx}{dt}=\frac{Vy^n}{y_0^n+y^n}&amp;lt;/math&amp;gt; (activation of x by y) or &amp;lt;math&amp;gt;\frac{dx}{dt}=\frac{V}{y_0^n+y^n}&amp;lt;/math&amp;gt; (suppression of x by y), and different chemical signals may switch the binding/Michaelis constant &amp;lt;math&amp;gt;y_0&amp;lt;/math&amp;gt;. To achieve good information transmission, one needs the mean responses to different chemical inputs to be well-separated and narrow -- but then one cannot realize the optimal response distribution &amp;lt;math&amp;gt;P(\rho)&amp;lt;/math&amp;gt; described above (see picture on the board for what these distributions look like).&lt;br /&gt;
*Ziv et al., 2007, have shown that the inability of biochemical networks to realize the optimal &amp;lt;math&amp;gt;P(\rho)&amp;lt;/math&amp;gt; is not very important: even with this constraint, the maximum information that can be transmitted through these systems is still close to &amp;lt;math&amp;gt;\min(S[S], 1/2\log_2 N)&amp;lt;/math&amp;gt;.&lt;br /&gt;
*Biochemical systems can thus be very efficient at transmitting information, with their intrinsic stochastic noise being, essentially, the main constraint.&lt;br /&gt;
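&lt;br /&gt;
A final numerical sketch (plain NumPy; the copy numbers and N are illustrative) for two statements used above: Poisson-like molecular noise has variance equal to the mean, &amp;lt;math&amp;gt;\sigma^2_\eta=\rho&amp;lt;/math&amp;gt;, and the counting argument then gives roughly &amp;lt;math&amp;gt;\sqrt{N}&amp;lt;/math&amp;gt; distinguishable levels, i.e. &amp;lt;math&amp;gt;I\approx 1/2\log_2 N&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;pre&amp;gt;
import numpy as np

rng = np.random.default_rng(1)

# Poisson-like molecular noise: the copy-number variance equals the mean,
# so the spread of the response around its mean rho is sqrt(rho).
for rho in (10, 100, 1000):
    r = rng.poisson(rho, size=200_000)
    print(rho, r.mean(), r.var())    # the variance tracks the mean

# Counting estimate: with a mean of N output molecules and sqrt(N) noise,
# about sqrt(N) response levels are distinguishable, so
#   I approx log2(sqrt(N)) = 1/2 log2(N).
N = 10_000
print(0.5 * np.log2(N))              # about 6.6 bits for N = 10^4
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>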
		<author><name>nemenman&gt;Ilya</name></author>
		
	</entry>
</feed>