Course Detail
Course Components:
The course will introduce the basic concepts, classical models and algorithms, and advanced subjects that are necessary for frontier research in probabilistic machine learning. Students are assumed to have a background in calculus, linear algebra, and statistics, a basic understanding of data structures and algorithm design, and some programming experience. Topics include Bayesian decision theory, exponential families, conjugate priors, graphical models, Bayesian kernel methods, variational inference, MCMC, variational auto-encoders, etc. After taking the class, students are expected to understand the basic principles of probabilistic machine learning; to be able to find relevant literature, study independently, and exploit existing tools for their own research or work interests; and to be well prepared to dive into cutting-edge research in this area, such as Bayesian nonparametrics, Bayesian deep learning, and large-scale posterior inference.
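As a small illustration of one listed topic, conjugate priors, the following sketch (not part of the course materials) shows the Beta-Bernoulli conjugate update: with a Beta(a, b) prior on a success probability and n binary observations containing k successes, the posterior is Beta(a + k, b + n - k).

```python
def beta_bernoulli_posterior(a, b, data):
    """Posterior Beta parameters after observing a list of 0/1 outcomes.

    a, b : parameters of the Beta(a, b) prior on the success probability.
    data : iterable of 0/1 observations (Bernoulli trials).
    """
    k = sum(data)               # number of successes
    n = len(data)               # number of trials
    return a + k, b + (n - k)   # conjugacy: posterior is again a Beta

# Example: uniform Beta(1, 1) prior, then observe 7 heads in 10 flips.
a_post, b_post = beta_bernoulli_posterior(1, 1, [1, 1, 1, 0, 1, 0, 1, 1, 0, 1])
posterior_mean = a_post / (a_post + b_post)   # (1 + 7) / (2 + 10) = 2/3
```

Because the prior and posterior are in the same family, the update is a closed-form parameter change; this is the property that makes conjugate priors a standard starting point before moving on to approximate methods such as variational inference and MCMC.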