
STA410_W24_StatisticalComputation

Third re-build of STA410 Statistical Computation / STA2102 Computational Techniques in Statistics

Course Topics

  1. Sampling: Inverse CDF, Rejection, and Importance Sampling [sketch after this list]

    1. Lecture Notebook
    2. Coding Demo: python speed
    3. Homework: Modulus Recursion
    4. Extra Coding: New to python?
    5. Extra Reading: Pseudorandomness and Floating-point numbers
  2. Estimation: Monte Carlo (MC) integration, estimation error, improving efficiency, antithetic sampling and control variates (correlation) [sketch after this list]

    1. Lecture Notebook
    2. Coding Demo: Adaptive Squeezed Rejection Sampling
    3. Homework: Integration Estimation
    4. Extra Reading: Integral Approximation
    5. Extra Reading: Importance Sampling Bias
    6. Extra Coding: Importance Sampling Hidden Markov Models (HMMs)
    7. Extra Reading: HMMs
  3. Markov Chain Monte Carlo (MCMC): High dimensional integration, Gibbs Sampling, Slice Sampling, Metropolis-Hastings, PyMC, Hamiltonian Monte Carlo (HMC) [sketch after this list]

    1. Lecture Notebook
    2. Coding Demo: Hamiltonian Monte Carlo with PyMC
    3. Homework: Probabilistic Programming
    4. Extra Coding: PyMC python
    5. Extra Reading: MCMC Diagnostics and Theory
  4. Numerical precision, error, and conditioning in linear algebra (floating-point behaviour and the SVD) [sketch after this list]

    1. Lecture Notebook
    2. No Coding Demo this week; we'll have a long lecture instead. The prerequisite reading becomes important for the end of this lecture, and its relevance continues into future material. What was being considered for the Coding Demo has instead remained part of the Homework [so the homework is a little longer than usual]
    3. Prerequisites: Linear Algebra
    4. Homework: Numerical Precision for Means and Variances
    5. Extra Reading: Analog versus Digital Arithmetic
  5. Linear Algebra: SVD/PCA/ICA/PCR, Condition, Regression VIFs, and Matrix Decompositions for Least Squares [sketch after this list]

    1. Prerequisites: Linear Algebra [still, and now probably even more, applicable compared to last week...]
    2. Lecture Notebook
    3. Coding Demo: Least Squares
    4. Homework: Randomized Linear Algebra
    5. Extra Coding: Gram-Schmidt and the Cholesky
    6. Extra Coding: More Least Squares
    7. Extra Reading: Computational Speed and Complexity
    8. Extra Reading: Matrix Condition Numbers
  6. Coding Challenge

  7. Reading Week

  8. Midterm

  9. From (Week 5) Direct Methods to Iterative Methods: Gauss-Seidel (GS), Successive Overrelaxation, Coordinate Descent (AKA Nonlinear GS), Gradient Descent, and AutoDiff [sketch after this list]

    1. Coding Demo: Splines, smoothing matrices (lowess/loess), generalized additive models (GAMs)
      [including some broader contextual material on basis functions, regularization, and penalty functions]
    2. Lecture Notebook
    3. Homework: Gradient Descent
    4. Extra Reading: Line Search to find optimal step sizes and Conjugate Gradient Descent
    5. Extra Coding: Conjugate Gradient Descent
    6. Extra Reading: Function Spaces
    7. Extra Coding: Lagrange Polynomial Interpolation
  10. Optimization, Hessians and Jacobians, Gauss-Newton, Maximum Likelihood Estimation (score function, etc.), Fisher Scoring, and Newton's Method [sketch after this list]

    1. Lecture Notebook
    2. Coding Demo / Homework Notebook (one combined notebook covering this item and the next): classical optimization methods in TensorFlow
      (with Nonlinear Gauss-Seidel, Gradient Descent, Gauss-Newton, Fisher Scoring, and Newton's Method)
    3. (combined with item 2 above)
    4. Extra Reading: Variants on Newton's Method and Convergence Considerations
    5. Extra Coding: Newton's Method versus Secant, Fixed-Point Iteration, etc.
  11. Modern optimizers; but IRLS (iteratively reweighted least squares), AKA Newton's Method / Fisher Scoring for M- and Quasi-Likelihood estimation, is still relevant... (and the "Sandwich Estimator") [sketch after this list]

    1. Lecture Notebook
    2. Ziang Coding Demo: fitting Poisson regression models with Quasi-Likelihood estimation
    3. Draft Coding Demo: fitting Poisson regression models with Quasi-Likelihood estimation
    4. Homework: Logistic Regression via IRLS
    5. Extra Coding: Huber Loss
    6. Extra Topic: Conjugate Gradient Descent for "double iterative" Truncated Newton's method
      1. See Week 6 Extra Reading on Conjugate Gradient Descent and Week 6 Extra Coding on Conjugate Gradient Descent
    7. Extra Reading: A few more notes
  12. Generative Modeling: NNs and the Likelihood Principle and Masked Autoencoder Density Estimation (MADE), KL divergence, Variational Inference (VI), Expectation-Maximization (EM), Variational Autoencoder, Bayes by Backprop, Normalizing Flows (MAFs, IAFs, and Real NVP), and Simulated Annealing [sketch after this list]

    1. Lecture Notebook
    2. Extra Reading: Copulas
  13. Coding Challenge

  14. Final
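
Topic Sketches

The short sketches below are not from the course materials; they are minimal, self-contained Python illustrations of some of the topics above, written against NumPy only and using simulated data throughout.

Week 1: a minimal sketch of inverse-CDF sampling (for an Exponential distribution) and rejection sampling (for a standard normal restricted to [-4, 4] with a uniform proposal). The distributions, bound, and seed are illustrative choices, not the course's.

```python
import numpy as np

rng = np.random.default_rng(410)

# Inverse CDF sampling: if U ~ Uniform(0, 1) then F^{-1}(U) has CDF F.
# For Exponential(rate), F^{-1}(u) = -log(1 - u) / rate.
def exponential_inverse_cdf(n, rate=1.0):
    u = rng.uniform(size=n)
    return -np.log(1 - u) / rate

# Rejection sampling: draw x from a proposal g and accept with
# probability f(x) / (M g(x)), where f <= M g everywhere. Here the
# target f is the standard normal density on [-4, 4] and the proposal
# g is Uniform(-4, 4) with density 1/8, so M = 8 f(0) works.
def normal_rejection(n):
    f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    samples = []
    while len(samples) < n:
        x = rng.uniform(-4, 4)
        if rng.uniform() < f(x) / f(0):  # f(x) / (M g(x)) with M g(x) = f(0)
            samples.append(x)
    return np.array(samples)

print(exponential_inverse_cdf(5, rate=2.0))
print(normal_rejection(5))
```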
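Week 2: a sketch of plain Monte Carlo integration against antithetic sampling and a control variate, on the toy integral of exp(u) over (0, 1) so the exact answer is available for comparison; the integrand and the estimated control-variate coefficient are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(410)
n = 100_000

# Estimate I = integral_0^1 exp(u) du = e - 1 by Monte Carlo.
h = np.exp

# Plain Monte Carlo.
u = rng.uniform(size=n)
plain = h(u).mean()

# Antithetic sampling: pair each U with 1 - U. Since h is monotone,
# h(U) and h(1 - U) are negatively correlated, so the paired average
# has lower variance for the same number of function evaluations.
u_half = rng.uniform(size=n // 2)
antithetic = 0.5 * (h(u_half) + h(1 - u_half)).mean()

# Control variate: U itself has known mean 1/2; subtracting its centred
# value, scaled by an estimate of the optimal coefficient
# b = Cov(h(U), U) / Var(U), removes the correlated part of the noise.
b = np.cov(h(u), u)[0, 1] / u.var(ddof=1)
control = (h(u) - b * (u - 0.5)).mean()

print(f"exact           {np.e - 1:.6f}")
print(f"plain MC        {plain:.6f}")
print(f"antithetic      {antithetic:.6f}")
print(f"control variate {control:.6f}")
```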
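Week 3: a random-walk Metropolis sketch in plain NumPy for a target known only up to its normalizing constant (here an unnormalized standard normal); the course demo uses PyMC and HMC, which this deliberately does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(410)

# Unnormalized log-density of the target: log p(x) = -x^2 / 2 + const.
def log_target(x):
    return -0.5 * x**2

def random_walk_metropolis(n_draws, step=1.0, x0=0.0):
    x = x0
    draws = np.empty(n_draws)
    accepted = 0
    for i in range(n_draws):
        proposal = x + step * rng.normal()  # symmetric proposal
        # Accept with probability min(1, p(proposal) / p(x)), computed
        # on the log scale for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
            accepted += 1
        draws[i] = x
    return draws, accepted / n_draws

draws, rate = random_walk_metropolis(50_000, step=2.4)
print(f"acceptance rate {rate:.2f}, mean {draws.mean():.3f}, sd {draws.std():.3f}")
```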
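Week 4: a sketch in the spirit of the Homework topic, contrasting the cancellation-prone one-pass variance formula E[X^2] - E[X]^2 with Welford's numerically stable one-pass update; the mean shift of 1e8 is an illustrative choice that makes the cancellation visible in float64.

```python
import numpy as np

rng = np.random.default_rng(410)

# With a large mean, E[X^2] and E[X]^2 are nearly equal, so subtracting
# them cancels most significant digits (catastrophic cancellation).
x = rng.normal(loc=1e8, scale=1.0, size=10_000)

naive = (x**2).mean() - x.mean()**2  # unreliable; can even go negative

# Welford's one-pass algorithm updates the mean and the sum of squared
# deviations incrementally, avoiding the cancellation.
mean, m2 = 0.0, 0.0
for k, xi in enumerate(x, start=1):
    delta = xi - mean
    mean += delta / k
    m2 += delta * (xi - mean)
welford = m2 / len(x)

print(f"naive   {naive:.6f}")
print(f"welford {welford:.6f}")  # close to the true variance, 1.0
print(f"numpy   {x.var():.6f}")  # numpy subtracts the mean first, which is stable
```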
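Week 5: a sketch comparing least-squares solves via the normal equations, the QR decomposition, and the SVD (through np.linalg.lstsq); the simulated design and noise scale are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(410)

# Simulated regression data with an intercept column.
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Normal equations: solve (X'X) beta = X'y. Simple, but this squares
# the condition number of X, amplifying floating-point error when X is
# ill-conditioned.
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# QR: X = QR with orthonormal Q, so the problem reduces to the (better
# conditioned) triangular system R beta = Q'y.
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ y)

# SVD: np.linalg.lstsq solves least squares via the SVD of X.
beta_svd, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.linalg.cond(X))  # condition number of X
print(beta_ne, beta_qr, beta_svd, sep="\n")
```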
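Week 9: a gradient-descent sketch on the least-squares objective, with a fixed step size chosen from the largest eigenvalue of X'X / n (a standard sufficient condition for convergence on a quadratic); the iteration count is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(410)

# Objective: f(beta) = ||y - X beta||^2 / (2n),
# gradient:  grad f = -X'(y - X beta) / n.
n, p = 500, 4
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = X @ beta_true + 0.1 * rng.normal(size=n)

# For a quadratic, any fixed step below 2 / L converges, where L is the
# largest eigenvalue of X'X / n (the Lipschitz constant of the gradient).
L = np.linalg.eigvalsh(X.T @ X / n).max()
step = 1.0 / L

beta = np.zeros(p)
for _ in range(500):
    grad = -X.T @ (y - X @ beta) / n
    beta -= step * grad

# Compare against the direct least-squares solution.
print(np.max(np.abs(beta - np.linalg.lstsq(X, y, rcond=None)[0])))
```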
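Week 10: a Newton's-method sketch for a maximum likelihood problem with no closed form, the Cauchy location model (a standard textbook example, assumed here rather than taken from the course): iterate theta <- theta - score(theta) / score'(theta) from a median start.

```python
import numpy as np

rng = np.random.default_rng(410)

# Cauchy(theta) log-likelihood has score sum 2u / (1 + u^2), u = x - theta,
# and derivative-of-score sum 2(u^2 - 1) / (1 + u^2)^2.
x = rng.standard_cauchy(500) + 3.0  # true location 3.0

def score(theta):
    u = x - theta
    return np.sum(2 * u / (1 + u**2))

def score_prime(theta):
    u = x - theta
    return np.sum(2 * (u**2 - 1) / (1 + u**2) ** 2)

theta = np.median(x)  # a robust starting point, near the MLE
for _ in range(20):
    step = score(theta) / score_prime(theta)
    theta -= step
    if abs(step) < 1e-10:  # stop once the update is negligible
        break

print(f"Newton MLE {theta:.4f} (true location 3.0)")
```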
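Week 11: a sketch of the Homework topic, logistic regression via IRLS, where each Fisher-scoring step is a weighted least-squares solve on a "working response"; the tolerance and iteration cap are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(410)

# IRLS for logistic regression: each step solves
#   beta <- (X' W X)^{-1} X' W z,   z = X beta + (y - mu) / w,
# with mu = sigmoid(X beta) and weights w = mu (1 - mu) on the diagonal of W.
n, p = 1000, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([-0.5, 1.0, 2.0])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

beta = np.zeros(p)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    w = mu * (1 - mu)
    z = X @ beta + (y - mu) / w     # the working response
    WX = X * w[:, None]             # rows of X scaled by their weights
    beta_new = np.linalg.solve(X.T @ WX, WX.T @ z)
    if np.max(np.abs(beta_new - beta)) < 1e-10:
        beta = beta_new
        break
    beta = beta_new

print(beta)  # should be close to beta_true for large n
```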
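Week 12: an EM sketch for a two-component univariate Gaussian mixture, a standard illustration of alternating E-steps (responsibilities) and M-steps (weighted parameter updates); the mixture, starting values, and iteration count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(410)

# Simulated data from a two-component mixture.
x = np.concatenate([rng.normal(-2, 1, 400), rng.normal(3, 0.5, 600)])

def normal_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

pi, mu, var = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: responsibility of component 2 for each observation.
    p1 = (1 - pi) * normal_pdf(x, mu[0], var[0])
    p2 = pi * normal_pdf(x, mu[1], var[1])
    r = p2 / (p1 + p2)
    # M-step: responsibility-weighted updates of weight, means, variances.
    pi = r.mean()
    mu = np.array([np.average(x, weights=1 - r), np.average(x, weights=r)])
    var = np.array([np.average((x - mu[0]) ** 2, weights=1 - r),
                    np.average((x - mu[1]) ** 2, weights=r)])

print(f"weights ({1 - pi:.2f}, {pi:.2f}), means {mu.round(2)}, vars {var.round(2)}")
```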
