Ph.D. Thesis


I. Nemenman. Information Theory and Learning: A Physical Approach. PhD thesis, Princeton University, Department of Physics, 2000. PDF, arXiv.

Abstract
We try to establish a unified information-theoretic approach to learning and to explore some of its applications. First, we define predictive information as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory, as well as in dynamical systems and statistical mechanics, emerge from this universally definable concept. We then prove that predictive information provides the unique measure of the complexity of the dynamics underlying a time series, and show that there are classes of models, characterized by power-law growth of the predictive information, that are qualitatively more complex than any of the systems investigated before. Further, we investigate numerically the learning of a nonparametric probability density, an example of a problem with power-law complexity, and show that the proper Bayesian formulation of this problem provides the 'Occam factors' that penalize overly complex models and thus allow one to learn not only a solution within a specific model class, but also the class itself, using only the data and very few a priori assumptions. We then study a possible information-theoretic method that regularizes the learning of an undersampled discrete variable, and show that learning in such a setup proceeds through stages of very different complexities. Finally, we discuss how all of these ideas may be useful in various problems in physics, statistics, and, most importantly, biology.
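
For intuition about the central quantity, here is a minimal Python sketch, not part of the thesis, that estimates the predictive information of a toy time series as the mutual information between adjacent past and future windows, using naive plug-in entropy estimates. The window length k, the binary Markov chain with flip probability 0.1, and all function names are illustrative assumptions.

 import numpy as np
 from collections import Counter
 
 def plugin_entropy(samples):
     # Naive maximum-likelihood ("plug-in") entropy estimate, in bits.
     counts = np.array(list(Counter(samples).values()), dtype=float)
     p = counts / counts.sum()
     return -np.sum(p * np.log2(p))
 
 def predictive_information(x, k):
     # I(past; future) for adjacent windows of length k each:
     # I = S(past) + S(future) - S(past, future).
     past   = [tuple(x[i:i + k])         for i in range(len(x) - 2 * k)]
     future = [tuple(x[i + k:i + 2 * k]) for i in range(len(x) - 2 * k)]
     joint  = list(zip(past, future))
     return (plugin_entropy(past) + plugin_entropy(future)
             - plugin_entropy(joint))
 
 # Toy data: a binary Markov chain that flips state with probability 0.1.
 rng = np.random.default_rng(0)
 x = np.zeros(100_000, dtype=int)
 for i in range(1, len(x)):
     x[i] = x[i - 1] ^ int(rng.random() < 0.1)
 
 for k in (1, 2, 4, 8):
     print(k, predictive_information(x, k))

For a Markov chain the estimates saturate at a constant as k grows, consistent with the abstract's distinction between such finitely complex processes and the qualitatively more complex classes whose predictive information keeps growing, for example as a power law. Note also that the plug-in estimator becomes strongly biased once the 2^(2k) joint states are undersampled; that is precisely the regime addressed by the thesis's regularization of learning an undersampled discrete variable.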