Bayesian Inference with MCMC

Description

The objective of this course is to introduce Markov Chain Monte Carlo methods for Bayesian modeling and inference. Attendees will start by learning the basics of Monte Carlo methods, augmented by hands-on examples in Python that illustrate how these algorithms work. This is the second course in a three-course specialization. Python and Jupyter notebooks are used throughout the course to illustrate and perform Bayesian modeling with PyMC3. The course website is located at https://sjster.github.io/introduction_to_computational_statistics/docs/index.html. The course notebooks can be downloaded by following the instructions at https://sjster.github.io/introduction_to_computational_statistics/docs/getting_started.html.

The instructor for this course will be Dr. Srijith Rajamohan.
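
The description mentions Bayesian modeling with PyMC3. As a rough illustration of what that looks like, here is a minimal sketch that fits a normal model to a made-up dataset; the dataset, priors, and sampler settings are assumptions for illustration, not taken from the course notebooks.

```python
import numpy as np
import pymc3 as pm

# Made-up observations: 100 draws from a normal with "unknown" mean and scale
rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)    # weakly informative prior on the mean
    sigma = pm.HalfNormal("sigma", sigma=5.0)   # prior on the standard deviation
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
    trace = pm.sample(2000, tune=1000, random_seed=42)  # MCMC sampling (NUTS by default)

print(pm.summary(trace))  # posterior means should land near 1.0 and 2.0
```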

Syllabus

  • Topics in Model Performance
    • This module gives an overview of topics related to assessing the quality of models. While some of these metrics may be familiar to those with a Machine Learning background, the goal is to bring awareness to concepts rooted in Information Theory (a minimal sketch of one such concept appears after this syllabus). The course website is https://sjster.github.io/introduction_to_computational_statistics/docs/Production/BayesianInference.html. Instructions to download and run the notebooks are at https://sjster.github.io/introduction_to_computational_statistics/docs/Production/getting_started.html
  • The Metropolis Algorithms for MCMC
    • This module serves as a gentle introduction to Markov Chain Monte Carlo methods. The general idea behind Markov chains is presented along with their role in sampling from distributions. The Metropolis and Metropolis-Hastings algorithms are introduced and implemented in Python to help illustrate their details (a minimal sketch appears after this syllabus). The course website is https://sjster.github.io/introduction_to_computational_statistics/docs/Production/MonteCarlo.html. Instructions to download and run the notebooks are at https://sjster.github.io/introduction_to_computational_statistics/docs/Production/getting_started.html
  • Gibbs Sampling and Hamiltonian Monte Carlo Algorithms
    • This module is a continuation of module 2 and introduces the Gibbs sampling and Hamiltonian Monte Carlo (HMC) algorithms for sampling from distributions. The Gibbs sampler is illustrated in detail, while HMC receives a more high-level treatment due to the complexity of the algorithm (a minimal Gibbs sampling sketch appears after this syllabus). Finally, some properties of MCMC algorithms are presented to set the stage for Course 3, which uses the popular probabilistic programming framework PyMC3. The course website is https://sjster.github.io/introduction_to_computational_statistics/docs/Production/MonteCarlo.html#gibbs-sampling. Instructions to download and run the notebooks are at https://sjster.github.io/introduction_to_computational_statistics/docs/Production/getting_started.html
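
The first module points to information-theoretic concepts behind model assessment. The specific metrics are not listed here, but several common ones (deviance, AIC, WAIC) are rooted in the Kullback-Leibler divergence; the following is a minimal sketch of KL divergence between two discrete distributions, using made-up numbers.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q) = sum_i p_i * log(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                       # terms with p_i = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# A "true" distribution over four outcomes and a model's approximation (made-up numbers)
p_true = np.array([0.4, 0.3, 0.2, 0.1])
q_model = np.array([0.25, 0.25, 0.25, 0.25])

print(kl_divergence(p_true, q_model))  # larger values indicate a poorer approximation
```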
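
The second module says the Metropolis and Metropolis-Hastings algorithms are implemented in Python. As a rough sketch of the idea (not the course's implementation), here is a random-walk Metropolis sampler targeting a standard normal density known only up to a constant; the step size and sample count are arbitrary choices.

```python
import numpy as np

def metropolis(log_target, x0, n_samples=5000, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    probability min(1, target(x') / target(x))."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Symmetric proposal, so the Hastings correction term cancels
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

# Target: standard normal, known only up to a normalizing constant
log_target = lambda x: -0.5 * x**2
draws = metropolis(log_target, x0=0.0)
print(draws[1000:].mean(), draws[1000:].std())  # roughly 0 and 1 after burn-in
```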
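
The third module illustrates the Gibbs sampler in detail. A minimal sketch of the idea, assuming a zero-mean, unit-variance bivariate normal target with correlation rho (so each full conditional is a univariate normal), follows; the target and parameter values are illustrative assumptions, not the course's example.

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.8, n_samples=5000, seed=0):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.
    Each full conditional is univariate normal: x | y ~ N(rho * y, 1 - rho^2)."""
    rng = np.random.default_rng(seed)
    samples = np.empty((n_samples, 2))
    x, y = 0.0, 0.0
    cond_sd = np.sqrt(1.0 - rho**2)
    for i in range(n_samples):
        x = rng.normal(rho * y, cond_sd)   # draw x from p(x | y)
        y = rng.normal(rho * x, cond_sd)   # draw y from p(y | x)
        samples[i] = x, y
    return samples

draws = gibbs_bivariate_normal()
print(np.corrcoef(draws[1000:].T))  # off-diagonal entries should be close to rho
```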

  • Type: Online Courses
  • Provider: Coursera

Related Courses