
Gaussian Mixture Model and EM Recitation Overview
These recitation slides introduce the Gaussian Mixture Model (GMM) and the Expectation Maximization (EM) algorithm, covering motivation, formulation, key definitions, and the detailed steps of EM. They explain how a GMM defines a probability distribution, how EM addresses inference and learning, and how the E step and M step alternate in a coordinate-ascent loop until convergence.
Presentation Transcript
GMM and EM
10-701 Recitation
Pengtao Xie, 3/20/2025
Outline
Gaussian Mixture Model
Expectation Maximization
Motivation
Formulation
Given data $\{x_i\}_{i=1}^N$ and $K$ clusters, introduce for each point a latent one-of-$K$ assignment vector, $\{z_i\}_{i=1}^N$. The mixing weights are $\pi_k = p(z_k = 1)$, with $\sum_{k=1}^K \pi_k = 1$, and the marginal density of a point is
$$p(x) = \sum_{k=1}^K \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k).$$
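To make the formula concrete, here is a minimal NumPy sketch (an illustration added here, not code from the recitation) that evaluates the mixture density $p(x) = \sum_k \pi_k \mathcal{N}(x \mid \mu_k, \sigma_k^2)$ for a toy one-dimensional, three-component mixture; all parameter values are made up.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    # Univariate Gaussian density N(x | mu, var)
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def gmm_pdf(x, pis, mus, variances):
    # Mixture density p(x) = sum_k pi_k * N(x | mu_k, var_k)
    return np.sum(pis * gaussian_pdf(x, mus, variances))

# Made-up parameters; the mixing weights pi_k must be nonnegative and sum to 1.
pis = np.array([0.5, 0.3, 0.2])
mus = np.array([-2.0, 0.0, 3.0])
variances = np.array([0.5, 1.0, 2.0])
print(gmm_pdf(0.0, pis, mus, variances))  # density of the mixture at x = 0
```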
GMM is a distribution
Outline
Gaussian Mixture Model
Expectation Maximization
Definitions
Input data, local variables, model parameters, inference, learning
Expectation Maximization
For any distribution $q(z)$ over the latent variables,
$$\log p(x) = \log \sum_z p(x, z) = \log \sum_z q(z) \frac{p(x, z)}{q(z)} = \log \mathbb{E}_q\!\left[\frac{p(x, z)}{q(z)}\right] \ge \mathbb{E}_q\!\left[\log \frac{p(x, z)}{q(z)}\right] = \mathbb{E}_q[\log p(x, z)] - \mathbb{E}_q[\log q(z)],$$
where the inequality is Jensen's inequality applied to the concave $\log$. The bound is tight if $q(z) = p(z \mid x)$.
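As a numerical sanity check (added here, not in the slides), the following sketch computes $\log p(x)$ and the lower bound $\mathbb{E}_q[\log p(x, z)] - \mathbb{E}_q[\log q(z)]$ for a toy two-component mixture and a single observation: the bound matches $\log p(x)$ exactly when $q(z)$ is the posterior $p(z \mid x)$, and is strictly smaller for any other $q$.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    # Univariate Gaussian density N(x | mu, var)
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Made-up two-component mixture and one observation.
pis = np.array([0.6, 0.4])
mus = np.array([-1.0, 2.0])
variances = np.array([1.0, 1.0])
x = 0.5

joint = pis * gaussian_pdf(x, mus, variances)  # p(x, z_k = 1) for each k
log_px = np.log(joint.sum())                   # log p(x) by marginalizing z

def bound(q):
    # E_q[log p(x, z)] - E_q[log q(z)] = sum_k q_k * log(joint_k / q_k)
    return np.sum(q * (np.log(joint) - np.log(q)))

posterior = joint / joint.sum()                # q(z) = p(z | x): bound is tight
print(log_px, bound(posterior))                # identical values
print(bound(np.array([0.5, 0.5])))             # any other q: strictly smaller
```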
Expectation Maximization
Setting $q(z) = p(z \mid x)$ and expanding the joint,
$$\log p(x) = \mathbb{E}_q[\log p(x, z)] - \mathbb{E}_q[\log q(z)] = \mathbb{E}_q[\log p(x \mid z)] + \mathbb{E}_q[\log p(z)] - \mathbb{E}_q[\log q(z)]$$
$$= \sum_{k=1}^K \mathbb{E}_q[z_k] \log \mathcal{N}(x \mid \mu_k, \Sigma_k) + \sum_{k=1}^K \mathbb{E}_q[z_k] \log \pi_k - \mathbb{E}_q[\log q(z)].$$
Expectation Maximization
$$\max_{q(z),\, \pi,\, \mu,\, \Sigma} \; \sum_{k=1}^K \mathbb{E}_q[z_k] \log \mathcal{N}(x \mid \mu_k, \Sigma_k) + \sum_{k=1}^K \mathbb{E}_q[z_k] \log \pi_k - \mathbb{E}_q[\log q(z)]$$
Coordinate Ascent: loop until convergence
1. Fix $\pi, \mu, \Sigma$, estimate $p(z \mid x)$ (E step)
2. Fix $p(z \mid x)$, estimate $\pi, \mu, \Sigma$ (M step)
E Step
$$p(z \mid x) = \frac{p(x \mid z)\, p(z)}{p(x)}, \qquad p(z_k = 1 \mid x) = \frac{p(x \mid z_k = 1)\, p(z_k = 1)}{p(x)} = \frac{\pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k)}{\sum_{k'=1}^K \pi_{k'} \, \mathcal{N}(x \mid \mu_{k'}, \Sigma_{k'})}$$
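In code, the E step is a single vectorized Bayes-rule computation. The sketch below (my illustration; the one-dimensional setting, parameter values, and function names are assumptions) returns a matrix resp whose entry resp[n, k] is the responsibility $p(z_k = 1 \mid x_n)$.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    # Univariate Gaussian density N(x | mu, var)
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def e_step(x, pis, mus, variances):
    # joint[n, k] = pi_k * N(x_n | mu_k, var_k); normalizing each row
    # over k applies Bayes' rule and yields the responsibilities.
    joint = pis * gaussian_pdf(x[:, None], mus, variances)
    return joint / joint.sum(axis=1, keepdims=True)

x = np.array([-2.1, 0.3, 3.4])
resp = e_step(x, np.array([0.5, 0.5]), np.array([-2.0, 3.0]), np.array([1.0, 1.0]))
print(resp)  # each row sums to 1: a posterior over the K components
```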
M Step
Over all $N$ data points, the same objective becomes
$$\max_{q(z),\, \pi,\, \mu,\, \Sigma} \; \sum_{n=1}^N \sum_{k=1}^K \mathbb{E}_q[z_{nk}] \log \mathcal{N}(x_n \mid \mu_k, \Sigma_k) + \sum_{n=1}^N \sum_{k=1}^K \mathbb{E}_q[z_{nk}] \log \pi_k - \mathbb{E}_q[\log q(z)].$$
In the M step the responsibilities $\mathbb{E}_q[z_{nk}]$ are held fixed, so maximizing over $\pi$ (subject to $\sum_k \pi_k = 1$), $\mu$, and $\Sigma$ has closed-form solutions: responsibility-weighted averages of the data.
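The sketch below (again my illustration for a one-dimensional mixture with the standard closed-form updates, not the recitation's own code) implements the M step and combines both steps into the coordinate-ascent loop from the earlier slide, run on synthetic data.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    # Univariate Gaussian density N(x | mu, var)
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def e_step(x, pis, mus, variances):
    joint = pis * gaussian_pdf(x[:, None], mus, variances)  # shape (N, K)
    return joint / joint.sum(axis=1, keepdims=True)

def m_step(x, resp):
    # Closed-form weighted maximum-likelihood updates.
    nk = resp.sum(axis=0)                                   # effective counts N_k
    pis = nk / len(x)                                       # pi_k = N_k / N
    mus = (resp * x[:, None]).sum(axis=0) / nk              # weighted means
    variances = (resp * (x[:, None] - mus) ** 2).sum(axis=0) / nk
    return pis, mus, variances

def em(x, pis, mus, variances, n_iters=50):
    # Coordinate ascent: alternate E and M steps.
    for _ in range(n_iters):
        resp = e_step(x, pis, mus, variances)   # fix parameters, update q(z)
        pis, mus, variances = m_step(x, resp)   # fix q(z), update parameters
    return pis, mus, variances

# Synthetic data from two well-separated clusters (for illustration only).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 100)])
print(em(x, np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])))
```

In practice one would monitor the log-likelihood to decide convergence and guard against collapsing variances; the fixed iteration count here just keeps the sketch short.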
Slides courtesy of Pattern Recognition and Machine Learning, Bishop.
Suggested Readings: Pattern Recognition and Machine Learning, Bishop, Sections 9.2 and 9.3.