Physics 212, 2018: Lecture 18

From Ilya Nemenman: Theoretical Biophysics @ Emory
Latest revision as of 11:28, 4 July 2018

Back to the main Teaching page.

Back to Physics 212, 2018: Computational Modeling.

General notes

A good introduction to probability theory, one of my favorites, though more on the mathematical side, can be found at Introduction to Probability by CM Grinstead and JL Snell.

Why do we need random numbers?

  • Some processes are fundamentally random (quantum mechanics, statistical mechanics, mutations, chemical reactions).
  • Some calculations are easier done using random numbers than using deterministic approaches (e.g., calculating area of a complex object).
  • Avatars for randomness: a coin toss, a die, the number of molecules in a certain volume of air, the time to a click of a Geiger counter.
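The second point above, using random numbers to compute areas, can be sketched with a minimal Monte Carlo example (an illustration, not part of the lecture materials): throw random points into the square enclosing a unit disk and count the fraction that lands inside.

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

# Estimate the area of the unit disk (true area = pi) by sampling points
# uniformly from the enclosing square [-1, 1] x [-1, 1] and counting the
# fraction that lands inside the disk.
n = 100_000
hits = sum(1 for _ in range(n)
           if random.uniform(-1, 1) ** 2 + random.uniform(-1, 1) ** 2 <= 1)
area = 4 * hits / n  # square area (4) times the fraction of hits
print(area)  # should be close to pi = 3.14159...
```

The same idea works for any region, however complicated, as long as one can test whether a point is inside it.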

Introducing concepts of randomness

To define the necessary probabilistic concepts, we need

  • To define a set of outcomes that a random variable can take (e.g., heads or tails, the six sides of a die, etc.).
  • Then we define the probability of a certain outcome as a limit of frequencies after many random draws, or events. That is, if after N draws the outcome i happened n_i times, then its frequency is f_i = n_i/N, and the probability is P_i = lim_{N→∞} n_i/N.
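This convergence of frequencies to probabilities can be watched numerically; a short sketch using Python's built-in random module as the source of randomness:

```python
import random

random.seed(1)  # fixed seed for reproducibility

# Flip a fair coin n times and record the frequency of heads; as n grows,
# the frequency approaches the probability 1/2.
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    freq = heads / n
    print(n, freq)
```

The deviation of the frequency from 1/2 shrinks roughly as 1/sqrt(n), so each hundred-fold increase in the number of draws buys about one more decimal digit of the probability.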

Probabilities satisfy the following properties, which follow from their definition as limits of frequencies:

  • nonnegativity: P(A) ≥ 0
  • unit normalization: Σ_i P(A_i) = 1, where the sum runs over all possible disjoint outcomes
  • nesting: if A ⊆ B, then P(A) ≤ P(B)
  • additivity (for non-disjoint events): P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
  • complementarity: P(not A) = 1 − P(A)
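These properties can be checked exactly for a fair six-sided die; a small sketch using exact fractions (the events "even" and "low" are arbitrary choices for illustration):

```python
from fractions import Fraction

# Exact probabilities for a fair six-sided die: each face has probability 1/6.
P = {face: Fraction(1, 6) for face in range(1, 7)}

def prob(event):
    """Probability of an event, i.e., a set of outcomes."""
    return sum(P[face] for face in event)

even = {2, 4, 6}
low = {1, 2, 3}

assert prob(set(P)) == 1                                              # unit normalization
assert prob({2}) <= prob(even)                                        # nesting: {2} is a subset of even
assert prob(even | low) == prob(even) + prob(low) - prob(even & low)  # additivity
assert prob(set(P) - even) == 1 - prob(even)                          # complementarity
print("all properties hold")
```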

What if we are studying more than one random variable?

The multivariate (joint) distribution P(A, B) is the probability of both events happening. It contains all of the information about the variables, including

  • The marginal distribution: P(A) = Σ_B P(A, B).
  • The conditional distribution, which can then be defined as P(B|A) = P(A, B)/P(A), so that P(A, B) = P(A) P(B|A): the probability of both events is the probability of the first happening, times the probability of the second happening given that the first one has happened.

The conditional distributions are related using the Bayes theorem, which says: P(A) P(B|A) = P(B) P(A|B), so that P(A|B) = P(B|A) P(A) / P(B).

We can also now formalize the intuitive concept of dependence among variables. Two random variables are considered to be statistically independent if and only if P(A, B) = P(A) P(B), or, equivalently, P(A|B) = P(A) or P(B|A) = P(B).
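All of these definitions can be verified on a small joint distribution over two binary variables; the numbers in the table below are made up for illustration:

```python
import numpy as np

# A hypothetical joint distribution P(A, B) over two binary variables,
# stored as a 2x2 table; rows index A, columns index B. Entries sum to 1.
P_AB = np.array([[0.1, 0.2],
                 [0.3, 0.4]])

P_A = P_AB.sum(axis=1)  # marginal P(A) = sum over B of P(A, B)
P_B = P_AB.sum(axis=0)  # marginal P(B)

P_B_given_A = P_AB / P_A[:, None]  # conditional P(B|A) = P(A, B) / P(A)

# Bayes theorem: P(A|B) = P(B|A) P(A) / P(B); compare to the direct
# definition P(A|B) = P(A, B) / P(B).
P_A_given_B = P_B_given_A * P_A[:, None] / P_B[None, :]
assert np.allclose(P_A_given_B, P_AB / P_B[None, :])

# Independence test: does P(A, B) = P(A) P(B) for every entry?
print(np.allclose(P_AB, np.outer(P_A, P_B)))  # False: these variables are dependent
```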

How easy is it to generate random numbers?

  • Do exercises on this web page to get a better feel for random numbers. Were you successful in generating random numbers without the help of a coin?
  • Linear congruential method for generating random numbers. See http://apps.nrbook.com/c/index.html, Chapter 7.1 for details.
  • Many standard systems use: multiplier = 7**5, modulus = 2**31-1, increment = 0
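A minimal sketch of such a generator with those standard parameters (this particular choice, a = 7**5, m = 2**31 − 1, c = 0, is the Park–Miller "minimal standard" generator):

```python
def lcg(seed, a=7**5, m=2**31 - 1, c=0, n=5):
    """Linear congruential generator: x_{k+1} = (a * x_k + c) mod m.

    Returns n pseudorandom floats rescaled to [0, 1). The defaults are the
    'minimal standard' parameters mentioned above.
    """
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)  # rescale the integer state to a float in [0, 1)
    return out

print(lcg(42))
```

The same seed always produces the same sequence, which is why such generators are called pseudorandom; poor choices of a and m produce visibly non-random sequences, which the exercise below explores.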

Your turn

  • Download the rnd_generation script and play with different multipliers and moduli for the linear congruential method.
  • Histogram your results. Do all values of the parameters produce good random numbers?
  • Submit your work.