Section 1: Introduction (Devore, 2010; Stark and Woods, 1994; Tijms, 2012)

1.1 History of probability

  • Understanding Probability (Tijms, 2012)

1.2 Populations, samples, and processes

  • population, census, sample
  • univariate, bivariate, multivariate
  • relation between probability and statistics

1.3 Descriptive statistics

  • stem-and-leaf displays
  • histograms

1.4 Measures of location

  • mean
  • median

1.5 Measures of variability

  • sample variance
  • box plots
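
As a brief illustration of 1.4 and 1.5, a minimal Python sketch (assuming NumPy is installed; the data values are made up) computing the usual measures of location and variability:

```python
import numpy as np

x = np.array([2.1, 3.5, 2.8, 4.0, 3.3, 2.9])   # made-up sample

print(np.mean(x))            # sample mean (1.4)
print(np.median(x))          # sample median (1.4)
print(np.var(x, ddof=1))     # sample variance with the n - 1 divisor (1.5)
print(np.std(x, ddof=1))     # sample standard deviation (1.5)
```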

Section 2: Probability (Devore, 2010; Stark and Woods, 1994; Tijms, 2012)

2.1 Sample space and events

  • experiment
  • sample space
  • events
  • set theory

2.2 Axioms, interpretation and properties of probability

  • axioms
  • interpretation
  • probability properties

2.3 Counting techniques

  • product rule for ordered pairs
  • permutations and combinations

2.4 Conditional probability

  • conditional probability
  • the multiplication rule
  • law of total probability
  • Bayes' theorem
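
As an illustration of 2.4 (the law of total probability and Bayes' theorem), a minimal Python sketch; the prevalence, sensitivity, and false-positive rate below are made-up illustration values:

```python
# Diagnostic-test example: D = disease present, + = positive test result.
p_d = 0.01                 # prior P(D), made-up prevalence
p_pos_given_d = 0.95       # sensitivity P(+ | D)
p_pos_given_not_d = 0.02   # false-positive rate P(+ | not D)

# Law of total probability: P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos)       # about 0.32: a positive test is far from conclusive
```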

2.5 Independence

  • independence
  • the multiplication rule for $P(A\cap B)$
  • independence of more than two events

Section 3: Discrete random variables and probability distributions (Devore, 2010)

3.1 Random variables

  • random variables
  • discrete and continuous random variables

3.2 Probability distributions for discrete random variables

  • probability mass function
  • cumulative distribution function

3.3 Expected values

  • expected value of X
  • rules of expected values
  • law of total expectation
  • variance and standard deviation of X
  • rules of variance
  • law of total variance

3.4 The Binomial probability distribution

  • the Binomial experiment
  • the Binomial random variable and distribution
    • random variable
    • probability mass function
    • expected value
    • variance

3.6 The Poisson probability distribution

  • the Poisson distribution
  • the Poisson distribution as a limit of the Binomial distribution
  • the mean and variance of X
  • the Poisson process
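
A small sketch (assuming SciPy is installed) of the limit result in 3.6: for large n and small p with np = λ held fixed, Binomial(n, p) probabilities approach Poisson(λ) probabilities.

```python
from scipy.stats import binom, poisson

lam = 3.0                      # fixed mean lambda = n * p
for n in (10, 100, 1000):
    p = lam / n
    # P(X = 2) under Binomial(n, p) versus Poisson(lambda)
    print(n, binom.pmf(2, n, p), poisson.pmf(2, lam))
# As n grows, the Binomial value approaches poisson.pmf(2, 3.0) ≈ 0.224.
```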

Section 4: Continuous random variables and probability distributions (Devore, 2010)

4.1 Probability density functions

  • the probability density function

4.2 Cumulative distribution functions and expected values

  • the cumulative distribution function
  • obtaining f(x) from F(x)
  • percentiles of a continuous distribution
  • expected values
  • law of total expectation
  • variance and standard deviation
  • law of total variance

4.3 Normal distribution

  • the normal distribution
  • percentiles of the standard normal distribution
  • critical values

4.4 The exponential distribution

  • the exponential distribution
    • pdf
    • mean
    • variance

4.6 Probability plots (a useful application of the preceding concepts in this chapter)

  • sample percentiles
  • probability plots
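
To connect 4.2, 4.3, and 4.6, a minimal sketch (assuming NumPy and SciPy) of a normal probability plot computed numerically: the ordered sample is paired with standard normal percentiles, and for normal data the points fall near a line whose slope and intercept estimate σ and μ. The sample here is simulated, so μ = 5 and σ = 2 are made-up illustration values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.sort(rng.normal(loc=5.0, scale=2.0, size=100))    # ordered sample

# Standard normal percentiles matching the (i - 0.5)/n sample percentiles
probs = (np.arange(1, x.size + 1) - 0.5) / x.size
z = norm.ppf(probs)

# A straight-line fit of x against z recovers sigma (slope) and mu (intercept)
slope, intercept = np.polyfit(z, x, 1)
print(slope, intercept)        # roughly 2 and 5
```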

Section 5: Functions of random variables (Stark and Woods, 1994)

3.1 Introduction

  • examples

3.2 Solving problems of the type Y = g(X)

  • examples
  • general formula for determining the pdf of Y = g(X)
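
A simulation check of the general formula f_Y(y) = Σ_k f_X(x_k) / |g'(x_k)| is a useful exercise; a minimal sketch (assuming NumPy) for X ~ N(0, 1) and Y = X², where the roots are x = ±√y:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = x ** 2                                     # Y = g(X) with g(x) = x^2

# Change-of-variables formula: roots x = +-sqrt(y), |g'(x)| = 2 sqrt(y)
def f_y(v):
    f_x = np.exp(-v / 2) / np.sqrt(2 * np.pi)  # f_X at +-sqrt(v) for N(0, 1)
    return 2 * f_x / (2 * np.sqrt(v))

# Compare the formula with an empirical density estimate near y = 1
lo, hi = 0.9, 1.1
empirical = np.mean((y > lo) & (y < hi)) / (hi - lo)
print(empirical, f_y(1.0))                     # both close to 0.24
```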


Section 6: Joint probability distributions and random samples (Devore, 2010)

5.1 Jointly distributed random variables

  • two discrete random variables
    • joint probability mass function
    • marginal probability mass function
  • two continuous random variables
    • joint probability density function
    • marginal probability density function
  • independent random variables
  • more than two random variables
    • independence
  • conditional distributions

5.2 Expected values, covariance, and correlation

  • E{h(X, Y)}
  • Cov(X, Y)
  • correlation coefficient

5.3 Statistics and their distributions

  • statistics and sampling distributions
  • random samples
  • deriving simple sampling distributions
  • simulation experiments

5.4 The distribution of the sample mean

  • general distribution of the sample mean
  • the Normal case
  • the distribution of a linear combination
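
A simulation experiment in the spirit of 5.3 and 5.4 (assuming NumPy; the exponential population and n = 30 are made-up illustration choices), checking that E[X̄] = μ and Var(X̄) = σ²/n:

```python
import numpy as np

rng = np.random.default_rng(2)

n, reps = 30, 10_000
samples = rng.exponential(scale=1.0, size=(reps, n))   # population with mu = sigma = 1
xbar = samples.mean(axis=1)                            # 10,000 realizations of the sample mean

print(xbar.mean())             # close to mu = 1
print(xbar.var())              # close to sigma^2 / n = 1/30 ≈ 0.033
```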

Section 7: Limit theorems (Ross, 2010)

8.2 Chebyshev's inequality and the weak law of large numbers

  • Markov's inequality
  • Chebyshev's inequality
  • The weak law of large numbers
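
Chebyshev's inequality can be checked by simulation; a minimal sketch (assuming NumPy) comparing the tail probability P(|X − μ| ≥ kσ) with the bound 1/k² for an exponential population (a made-up illustration choice with μ = σ = 1):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=1_000_000)   # mu = 1, sigma = 1

mu, sigma = 1.0, 1.0
for k in (2, 3, 4):
    tail = np.mean(np.abs(x - mu) >= k * sigma)  # empirical P(|X - mu| >= k sigma)
    print(k, tail, 1 / k ** 2)                   # the bound 1/k^2 is far from tight here
```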

8.3 The central limit theorem

  • The central limit theorem for iid RVs (with proof)
  • The central limit theorem for independent RVs

8.4 The strong law of large numbers

  • The strong law of large numbers

Section 8: Vector random variables (Stark and Woods, 1994; Bishop, 2006)

5.1 Joint distributions and densities (S&W, 1994)

  • probability distribution function (PDF)
  • probability density function (pdf)

5.2 Expectation vectors and covariance matrices (S&W, 1994)

  • expectation vector
  • covariance matrix

5.3 Properties of covariance matrices (S&W, 1994)

  • positive definite matrices

5.4 The multidimensional Gaussian law (S&W, 1994)

  • multidimensional normal pdf
  • distribution of Y=AX, when X is an n-dimensional normal random vector
  • sampling (correlated) multidimensional normal random vectors
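
For the last item of 5.4, a minimal sketch (assuming NumPy; μ and Σ below are made-up illustration values) of sampling correlated normal vectors via the Cholesky factor of a positive definite covariance matrix (cf. 5.3):

```python
import numpy as np

rng = np.random.default_rng(4)

mu = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.8],
                [0.8, 1.0]])               # positive definite covariance matrix

# Y = mu + L Z with Z ~ N(0, I) and L L^T = cov, so Cov(Y) = cov
L = np.linalg.cholesky(cov)
z = rng.standard_normal((10_000, 2))
y = mu + z @ L.T

print(y.mean(axis=0))                      # close to mu
print(np.cov(y, rowvar=False))             # close to cov
```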

2.3.1 Conditional Gaussian distributions (Bishop, 2006)

2.3.2 Marginal Gaussian distributions (Bishop, 2006)


Section 9: Inference (Casella and Berger, 2002; Bishop, 2006)

6.3.1 The Likelihood function (C&B, 2002)

7.2.2 Maximum likelihood estimators (C&B, 2002)

2.3.3 Inference in the linear Gaussian model (Bishop, 2006) (exercise, solution)

3.1.1 Maximum likelihood and least squares linear regression (Bishop, 2006)
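
As a preview of 3.1.1, a minimal sketch (assuming NumPy; the generating weights and noise level are made-up illustration values) showing that maximum likelihood under Gaussian noise reduces to least squares, with the noise variance estimated by the mean squared residual:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data from t = 2x + 1 + Gaussian noise
x = rng.uniform(-1.0, 1.0, size=200)
t = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=200)

# Design matrix Phi = [1, x]; the ML solution w_ML minimizes the sum of squares
Phi = np.column_stack([np.ones_like(x), x])
w_ml, *_ = np.linalg.lstsq(Phi, t, rcond=None)
print(w_ml)                                    # roughly [1.0, 2.0]

# ML estimate of the noise variance: mean squared residual
sigma2_ml = np.mean((t - Phi @ w_ml) ** 2)
print(sigma2_ml)                               # roughly 0.3^2 = 0.09
```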

3.3 Bayesian linear regression (Bishop, 2006)


Bibliography

  • Bishop, C. M. (2006). Pattern Recognition and Machine Learning. New York: Springer.

  • Casella, G., & Berger, R. L. (2002). Statistical Inference (2nd ed.). Pacific Grove, CA: Duxbury Press.

  • Devore, J. L. (2011). Probability and Statistics for Engineering and the Sciences. Cengage Learning.

  • Ross, S. M. (2010). A First Course in Probability (8th ed.). Pearson Prentice Hall.

  • Stark, H., & Woods, J. W. (1994). Probability, Random Processes, and Estimation Theory for Engineers (2nd ed.). Prentice-Hall.

  • Tijms, H. (2012). Understanding Probability. Cambridge University Press.