Nemenman et al., 2004
The major problem in information-theoretic analysis of neural responses is the reliable estimation of entropy-like quantities from small samples. We review a Bayesian estimator of entropies introduced recently [Nemenman et al., 2002] to solve this problem, and study its performance on synthetic and experimental spike trains. The estimator performs admirably even very deep in the undersampled regime, where other techniques fail. This opens new possibilities for the information-theoretic analysis of experiments, and may be of general interest as an example of learning from limited data.
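To make the undersampling problem concrete, here is a minimal Python sketch. It is not the Bayesian (NSB) estimator reviewed in the paper; it only illustrates the bias that motivates it, using the classical maximum-likelihood ("plug-in") estimator and the simple first-order Miller-Madow correction. The alphabet size `K`, sample size `N`, and the choice of a uniform source are all illustrative assumptions, not taken from the paper.

```python
import math
import random
from collections import Counter

def plugin_entropy(counts):
    """Maximum-likelihood ("plug-in") entropy estimate, in bits."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

def miller_madow(counts):
    """Plug-in estimate plus the first-order Miller-Madow bias correction."""
    n = sum(counts)
    k_observed = sum(1 for c in counts if c > 0)  # bins actually seen in the sample
    return plugin_entropy(counts) + (k_observed - 1) / (2 * n * math.log(2))

random.seed(0)
K = 1000               # alphabet size, e.g. number of distinct spike "words" (illustrative)
N = 200                # sample size: N << K puts us deep in the undersampled regime
true_H = math.log2(K)  # entropy of the uniform source, about 9.97 bits

sample = [random.randrange(K) for _ in range(N)]
counts = list(Counter(sample).values())

# The plug-in estimate can never exceed log2(N) bits, so with N << K it badly
# underestimates true_H; Miller-Madow recovers some, but not all, of the gap.
print(f"true: {true_H:.2f}  plug-in: {plugin_entropy(counts):.2f}  "
      f"Miller-Madow: {miller_madow(counts):.2f}")
```

With these numbers the plug-in estimate is bounded by log2(200), roughly 7.6 bits, against a true entropy near 10 bits, and the first-order correction closes only a fraction of that gap. This is the regime in which the paper argues the Bayesian estimator still performs well.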