Understanding Generative vs. Discriminative Models in Machine Learning

Explore the key differences between generative and discriminative models in machine learning, including their approaches, assumptions, and applications. Topics include graphical models, logistic regression, probabilistic classifiers, and classification rules, with insights into model verification tests and Bayesian classification.



Presentation Transcript


  1. Machine Learning (Part II) Test
     Angelo Ciaramella

  2. Question 27 - Graphical models
     Question: Discriminative models...
       a) directly assume some functional form
       b) indirectly assume functional forms
       c) assume Gaussian distributions

  3. Generative vs. Discriminative Models
     Generative models:
       - Assume some functional form for P(X|Y) and P(Y)
       - Estimate the parameters of P(X|Y) and P(Y) directly from the training data
       - Use Bayes' rule to calculate P(Y|X = x)
     Discriminative models:
       - Directly assume some functional form for P(Y|X)
       - Estimate the parameters of P(Y|X) directly from the training data
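
To make the contrast concrete, here is a minimal sketch (not part of the original slides; the toy data and variable names are illustrative) using scikit-learn: GaussianNB is generative, estimating P(X|Y) and P(Y) and applying Bayes' rule, while LogisticRegression fits P(Y|X) directly.

      # Minimal sketch, assuming scikit-learn is available; the data is synthetic.
      import numpy as np
      from sklearn.naive_bayes import GaussianNB            # generative: models P(X|Y), P(Y)
      from sklearn.linear_model import LogisticRegression   # discriminative: models P(Y|X)

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])  # two classes
      y = np.array([0] * 50 + [1] * 50)

      gen = GaussianNB().fit(X, y)            # estimates class-conditional Gaussians and priors
      disc = LogisticRegression().fit(X, y)   # fits the posterior P(Y|X) directly

      x_new = np.array([[1.0, 1.0]])
      print(gen.predict_proba(x_new))    # posterior obtained via Bayes' rule
      print(disc.predict_proba(x_new))   # posterior modeled directly

Both calls print a two-class posterior for x_new; the models differ only in how that posterior is obtained.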

  4. Generative Model
     [Graphical-model diagram: a class node generating the attribute nodes Color, Size, Texture, Weight]

  5. Discriminative Model
     Logistic regression
     [Diagram: the attributes Color, Size, Texture, Weight feeding a logistic-regression classifier]

  6. Discriminative model
     P(C|X),  C = c_1, ..., c_L,  X = (X_1, ..., X_n)
     A discriminative probabilistic classifier takes the input x = (x_1, x_2, ..., x_n)
     and directly outputs the posteriors P(c_1|x), P(c_2|x), ..., P(c_L|x).

  7. Generative model
     P(X|C),  C = c_1, ..., c_L,  X = (X_1, ..., X_n)
     One generative probabilistic model is built per class: the model for class c_i
     takes the input x = (x_1, x_2, ..., x_n) and outputs the likelihood P(x|c_i),
     yielding P(x|c_1), P(x|c_2), ..., P(x|c_L).

  8. Bayes classifier
     Bayes rule:
       P(C|X) = P(X|C) P(C) / P(X)
     i.e. posterior = likelihood × prior / normalization constant.

  9. MAP classification rule
     Maximum A Posteriori rule: assign x to the class c* such that
       P(C = c* | X = x) > P(C = c | X = x),  c ≠ c*,  c = c_1, ..., c_L
     Generative classification:
       P(C = c_i | X = x) = P(X = x | C = c_i) P(C = c_i) / P(X = x)
                          ∝ P(X = x | C = c_i) P(C = c_i),  for i = 1, 2, ..., L
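
As an illustration (the priors and likelihood values below are hypothetical, not from the slides), the MAP rule reduces to an argmax over P(X=x|C=c_i) P(C=c_i), since the normalization constant P(X=x) is identical for every class:

      import numpy as np

      # Hypothetical 3-class problem: priors P(C=c_i) and likelihoods
      # P(X=x|C=c_i) for a single observed x.
      priors = np.array([0.5, 0.3, 0.2])
      likelihoods = np.array([0.01, 0.05, 0.02])

      unnormalized = likelihoods * priors             # P(X=x|C=c_i) P(C=c_i)
      posterior = unnormalized / unnormalized.sum()   # dividing by P(X=x) normalizes

      print(posterior)                 # full Bayes posterior
      print(np.argmax(unnormalized))   # MAP class; normalizing does not change the argmax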

  10. MAP classification rule
      Bayes classification:
        P(C|X) ∝ P(X|C) P(C) = P(X_1, ..., X_n | C) P(C)
      Difficulty: learning the joint probability P(X_1, ..., X_n | C).
      Naïve Bayes:
        P(X_1, X_2, ..., X_n | C) = P(X_1 | X_2, ..., X_n; C) P(X_2, ..., X_n | C)
                                  = P(X_1 | C) P(X_2, ..., X_n | C)
                                  = P(X_1 | C) P(X_2 | C) ... P(X_n | C)
      since all input attributes are assumed conditionally independent given the class.

  11. Naïve Bayes

  12. Naïve Bayes
      Learning phase (given a training set S):
        For each target value c_i (c_i = c_1, ..., c_L):
          estimate P̂(C = c_i) with the examples in S;
          for every value x_jk of each attribute X_j (j = 1, ..., n; k = 1, ..., N_j):
            estimate P̂(X_j = x_jk | C = c_i) with the examples in S.
      Test phase:
        For an unknown instance X' = (a_1, ..., a_n), assign the label c* if
          [P̂(a_1 | c*) ... P̂(a_n | c*)] P̂(c*) > [P̂(a_1 | c) ... P̂(a_n | c)] P̂(c),
          c ≠ c*,  c = c_1, ..., c_L.
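
The two phases above amount to counting frequencies and multiplying the resulting estimates. A from-scratch sketch (my own illustration; the function names are assumptions, not code from the slides):

      from collections import Counter, defaultdict

      def train_naive_bayes(examples):
          # Learning phase. examples: list of (attributes, label) pairs,
          # where attributes is a tuple of discrete values.
          class_counts = Counter(label for _, label in examples)
          total = len(examples)
          priors = {c: class_counts[c] / total for c in class_counts}

          # cond[(j, value, c)] estimates P(X_j = value | C = c)
          cond = defaultdict(float)
          pair_counts = Counter()
          for attrs, label in examples:
              for j, value in enumerate(attrs):
                  pair_counts[(j, value, label)] += 1
          for (j, value, c), count in pair_counts.items():
              cond[(j, value, c)] = count / class_counts[c]
          return priors, cond

      def classify(priors, cond, attrs):
          # Test phase: pick the c* maximizing [prod_j P(a_j|c)] P(c).
          def score(c):
              s = priors[c]
              for j, value in enumerate(attrs):
                  s *= cond[(j, value, c)]
              return s
          return max(priors, key=score)

Note that an attribute value never seen with a class gets probability zero here, driving the whole product to zero; the m-estimate on the next slide addresses exactly this.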

  13. Example: the m-estimate
      P̂(X_j = x_jk | C = c_i) = (n_c + m·p) / (n + m)
      where
        n   = number of training examples for which C = c_i
        n_c = number of training examples for which X_j = x_jk and C = c_i
        p   = prior estimate (usually p = 1/t for t possible values of X_j)
        m   = weight given to the prior (number of "virtual" examples, m ≥ 1)
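
A one-function sketch of the m-estimate (illustrative; the parameter names mirror the slide's symbols):

      def m_estimate(n_c, n, t, m=1.0):
          # Smoothed estimate (n_c + m*p) / (n + m) with uniform prior p = 1/t:
          #   n_c: training examples with X_j = x_jk and C = c_i
          #   n:   training examples with C = c_i
          #   t:   number of possible values of attribute X_j
          #   m:   weight given to the prior ("virtual" examples)
          p = 1.0 / t
          return (n_c + m * p) / (n + m)

      # An unseen attribute value (n_c = 0) no longer gets probability zero:
      print(m_estimate(n_c=0, n=5, t=3))   # 0.0555..., not 0.0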

  14. Learning phase
      Outlook     Play=Yes  Play=No     Temperature  Play=Yes  Play=No
      Sunny         2/9       3/5       Hot            2/9       2/5
      Overcast      4/9       0/5       Mild           4/9       2/5
      Rain          3/9       2/5       Cool           3/9       1/5

      Humidity    Play=Yes  Play=No     Wind         Play=Yes  Play=No
      High          3/9       4/5       Strong         3/9       3/5
      Normal        6/9       1/5       Weak           6/9       2/5

      P(Play=Yes) = 9/14     P(Play=No) = 5/14

  15. Test phase
      New instance: x' = (Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong)
      Look up the tables:
        P(Outlook=Sunny|Play=Yes) = 2/9        P(Outlook=Sunny|Play=No) = 3/5
        P(Temperature=Cool|Play=Yes) = 3/9     P(Temperature=Cool|Play=No) = 1/5
        P(Humidity=High|Play=Yes) = 3/9        P(Humidity=High|Play=No) = 4/5
        P(Wind=Strong|Play=Yes) = 3/9          P(Wind=Strong|Play=No) = 3/5
        P(Play=Yes) = 9/14                     P(Play=No) = 5/14
      MAP rule:
        P(Yes|x') ∝ [P(Sunny|Yes) P(Cool|Yes) P(High|Yes) P(Strong|Yes)] P(Play=Yes) = 0.0053
        P(No|x')  ∝ [P(Sunny|No) P(Cool|No) P(High|No) P(Strong|No)] P(Play=No) = 0.0206
      Since P(Yes|x') < P(No|x'), we label x' as "No".
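
The slide's arithmetic checks out; a few lines of Python (my own verification, not part of the deck) reproduce both scores exactly:

      from fractions import Fraction as F

      score_yes = F(2, 9) * F(3, 9) * F(3, 9) * F(3, 9) * F(9, 14)
      score_no  = F(3, 5) * F(1, 5) * F(4, 5) * F(3, 5) * F(5, 14)

      print(float(score_yes))   # 0.00529... -> 0.0053
      print(float(score_no))    # 0.02057... -> 0.0206
      print("No" if score_no > score_yes else "Yes")   # "No"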

  16. Continuous inputs
      Normal distribution:
        P̂(X_j | C = c_i) = 1 / (√(2π) σ_ji) · exp( −(X_j − μ_ji)² / (2 σ_ji²) )
      where
        μ_ji = mean (average) of the values of attribute X_j over the examples for which C = c_i
        σ_ji = standard deviation of the values of attribute X_j over the examples for which C = c_i
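
A direct transcription of the density into Python (illustrative sketch; the names follow the slide's symbols):

      import math

      def gaussian_likelihood(x_j, mu_ji, sigma_ji):
          # P(X_j = x_j | C = c_i) under a normal distribution with the
          # per-class mean mu_ji and standard deviation sigma_ji.
          coeff = 1.0 / (math.sqrt(2.0 * math.pi) * sigma_ji)
          return coeff * math.exp(-((x_j - mu_ji) ** 2) / (2.0 * sigma_ji ** 2))

      # Likelihood of x_j = 1.0 under a standard normal N(0, 1):
      print(gaussian_likelihood(1.0, 0.0, 1.0))   # ~0.2420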

  17. References
      Material: slides, video lessons, books.
      I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, MIT Press, 2016.
