Understanding Concept Learning and Version Spaces in Machine Learning

In the field of machine learning, concept learning involves inferring a general definition of a concept from labeled examples. The goal is to approximate the target concept by selecting the best concept description from a set of possible hypotheses. The approach is illustrated with examples such as predicting whether a day is suitable for water sports based on various attributes of the day. Hypotheses are represented as conjunctions of constraints over instance attributes, and new instances are classified according to whether they satisfy those constraints.



Presentation Transcript


  1. MACHINE LEARNING, LECTURE 2: CONCEPT LEARNING AND VERSION SPACES. Based on Chapter 2 of Mitchell's book.

  2. OUTLINE: Concept learning from examples; Hypothesis representation; General-to-specific ordering of hypotheses; The Find-S algorithm.

  3. WHAT IS A CONCEPT? Assume a given domain, e.g. objects, animals, etc. A concept is a subset of the domain, e.g. birds ⊆ animals. [Figure: the domain of Things contains Animals and Cars, with Birds as a subset of Animals.] Generally we can't look at all objects in the domain.

  4. WHAT IS A CONCEPT? A concept can be defined as a Boolean-valued function c over the larger set. Example: a function defined over all animals whose value is true for birds and false for every other animal: c(bird) = true, c(cat) = false, c(car) = false, c(pigeon) = true.
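A minimal sketch of this idea in Python (the set used to define the concept is a made-up placeholder, not part of the slides):

```python
# A concept as a Boolean-valued function c: domain -> {True, False}.
# The set of birds below is a hypothetical placeholder for illustration.
BIRDS = {"bird", "pigeon", "sparrow"}

def c(instance: str) -> bool:
    """Return True if the instance belongs to the concept 'bird'."""
    return instance in BIRDS

print(c("pigeon"))  # True
print(c("cat"))     # False
```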

  5. WHAT IS CONCEPT LEARNING? Given a set of examples labeled as members (positive) or non-members (negative) of a concept, infer the general definition of that concept. In other words, approximate c from training examples: infer the best concept description from the set of all possible hypotheses.

  6. EXAMPLE OF A CONCEPT LEARNING TASK. Concept: good days on which my friend enjoys water sport (values: Yes, No). Task: predict the value of EnjoySport for an arbitrary day based on the values of the other attributes. Result: a classifier for days described by the following attributes:

     Sky    Temp  Humid   Wind    Water  Forecast  EnjoySport
     Sunny  Warm  Normal  Strong  Warm   Same      Yes
     Sunny  Warm  High    Strong  Warm   Same      Yes
     Rainy  Cold  High    Strong  Warm   Change    No
     Sunny  Warm  High    Strong  Cool   Change    Yes

  7. REPRESENTING HYPOTHESES. There are many possible hypothesis representations. The simplest is to represent a hypothesis h as a conjunction of constraints over the instance attributes. Each constraint can be: a specific value (e.g., Water = Warm); "?", meaning don't care / any value (e.g., Water = ?); or "∅", meaning no value allowed (e.g., Water = ∅). Example hypothesis h over (Sky, Temp, Humid, Wind, Water, Forecast): <Sunny, ?, ?, Strong, ?, Same>. We say h(x) = 1 for a day x if x satisfies the description.
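A small Python sketch of this representation (using the string "?" and the value None to stand in for the "any value" and "no value" constraints is an implementation choice, not part of the slides):

```python
# A hypothesis is a tuple of constraints, one per attribute:
#   a concrete value -> the attribute must equal it
#   ANY ("?")        -> any value is acceptable
#   NONE (None)      -> no value is acceptable (the empty constraint)
ANY, NONE = "?", None

def satisfies(instance, hypothesis):
    """Return True (i.e. h(x) = 1) if the instance meets every constraint."""
    return all(
        constraint == ANY or constraint == value
        for value, constraint in zip(instance, hypothesis)
    )

h = ("Sunny", ANY, ANY, "Strong", ANY, "Same")
x = ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same")
print(satisfies(x, h))  # True
```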

  8. MOST GENERAL/SPECIFIC HYPOTHESIS. The most general hypothesis, <?, ?, ?, ?, ?, ?>, says that every day is a positive example. The most specific hypothesis, <∅, ∅, ∅, ∅, ∅, ∅>, says that no day is a positive example.

  9. PROTOTYPICAL CONCEPT LEARNING TASK. GIVEN: Instances X: possible days, each described by the attributes Sky, AirTemp, Humidity, Wind, Water, Forecast. Target function c: EnjoySport: X → {0, 1}. Hypotheses H: conjunctions of constraints on attributes, e.g. <?, Cold, High, ?, ?, ?>. Training examples D: positive and negative examples of the target function, <x1, c(x1)>, <x2, c(x2)>, ..., <xn, c(xn)>. DETERMINE: a hypothesis h in H with h(x) = c(x) for all x in D.

  10. THE INDUCTIVE LEARNING HYPOTHESIS. "Any hypothesis found to approximate the target function well over the training examples will also approximate the target function well over the unobserved examples." (Tom Mitchell) Assumptions for inductive learning algorithms: the training sample represents the population, and the input attributes permit discrimination.

  11. NUMBER OF INSTANCES, CONCEPTS, HYPOTHESES. Learning can be viewed as a task of searching a large space. Attribute values: Sky: Sunny, Cloudy, Rainy; AirTemp: Warm, Cold; Humidity: Normal, High; Wind: Strong, Weak; Water: Warm, Cold; Forecast: Same, Change. Number of distinct instances: 3*2*2*2*2*2 = 96. Number of distinct concepts: 2^96. Number of syntactically distinct hypotheses: 5*4*4*4*4*4 = 5120 (each attribute may additionally take the general constraint "?" or the empty constraint "∅"). Number of semantically distinct hypotheses: 1 + 4*3*3*3*3*3 = 973 (every hypothesis containing "∅" classifies all instances as negative, so together they count as one).
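These counts can be verified with a few lines of Python (a small sketch; the per-attribute value counts are taken from the slide):

```python
# Number of values per attribute: Sky has 3, the other five have 2.
value_counts = [3, 2, 2, 2, 2, 2]

# Distinct instances: the product of the value counts.
instances = 1
for n in value_counts:
    instances *= n
print(instances)       # 96

# Distinct concepts: one for every subset of the instance space.
print(2 ** instances)  # 2**96

# Syntactically distinct hypotheses: each attribute also allows '?' and the
# empty constraint, i.e. (n + 2) choices per attribute.
syntactic = 1
for n in value_counts:
    syntactic *= n + 2
print(syntactic)       # 5120

# Semantically distinct hypotheses: every hypothesis with an empty constraint
# classifies all instances as negative, so they collapse into one hypothesis.
semantic = 1
for n in value_counts:
    semantic *= n + 1
print(1 + semantic)    # 973
```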

  12. CONCEPT LEARNING AS SEARCH. Learning can be viewed as a task of searching a large space. Different learning algorithms search this space in different ways!

  13. GENERAL-TO-SPECIFIC ORDERING OF HYPOTHESES. Many algorithms rely on an ordering of hypotheses. Consider two hypotheses: h1 = <Sunny, ?, ?, Strong, ?, ?> and h2 = <Sunny, ?, ?, ?, ?, ?>. h2 imposes fewer constraints than h1: it classifies more instances x as positive (h(x) = 1). h2 is more general than h1! How can this be formalized?

  14. GENERAL-TO-SPECIFIC ORDERING OF HYPOTHESES (continued). Formally, h2 is more general than or equal to h1 iff every instance that satisfies h1 also satisfies h2: h1(x) = 1 ⇒ h2(x) = 1 for all x. We write this as h2 ≥g h1.
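On this small attribute space the ≥g relation can be checked directly by enumerating all instances. A self-contained sketch (the function names are my own, and it reuses the satisfies idea from the earlier snippet):

```python
from itertools import product

# Attribute value sets from the EnjoySport example.
VALUES = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
          ("Strong", "Weak"), ("Warm", "Cold"), ("Same", "Change")]
ANY = "?"

def satisfies(x, h):
    """h(x) = 1: every constraint is '?' or equals the attribute value."""
    return all(c == ANY or c == v for v, c in zip(x, h))

def more_general_or_equal(h2, h1):
    """h2 >=g h1 iff every instance satisfying h1 also satisfies h2."""
    return all(satisfies(x, h2) for x in product(*VALUES) if satisfies(x, h1))

h1 = ("Sunny", ANY, ANY, "Strong", ANY, ANY)
h2 = ("Sunny", ANY, ANY, ANY, ANY, ANY)
print(more_general_or_equal(h2, h1))  # True
print(more_general_or_equal(h1, h2))  # False
```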

  15. INSTANCES, HYPOTHESES, AND MORE-GENERAL-THAN. [Figure: instances x1 and x2 on the left, hypotheses h1, h2, h3 on the right, arranged from specific to general; h2 is more general than both h1 and h3.] x1 = <Sunny, Warm, High, Strong, Cool, Same>; x2 = <Sunny, Warm, High, Light, Warm, Same>; h1 = <Sunny, ?, ?, Strong, ?, ?>; h2 = <Sunny, ?, ?, ?, ?, ?>; h3 = <Sunny, ?, ?, ?, Cool, ?>. The ≥g relation is important as it provides a structure over the hypothesis space.

  16. FIND-S ALGORITHM. Initialize h to the most specific hypothesis in H. For each positive training instance x: for each attribute constraint ai in h, if the constraint is not satisfied by x, replace ai by the next more general constraint that is satisfied by x. Output hypothesis h.
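A minimal, self-contained Python sketch of Find-S for this conjunctive representation (the function and variable names are hypothetical; negative examples are simply skipped, since the algorithm only generalizes on positives):

```python
ANY, NONE = "?", None  # "don't care" and "no value allowed" constraints

def find_s(examples, n_attributes=6):
    """Return the most specific hypothesis consistent with the positive examples."""
    h = [NONE] * n_attributes        # start from the most specific hypothesis
    for x, label in examples:
        if label != "Yes":           # Find-S ignores negative examples
            continue
        for i, value in enumerate(x):
            if h[i] == NONE:         # first positive example: adopt its value
                h[i] = value
            elif h[i] != value:      # conflicting value: generalize to '?'
                h[i] = ANY
    return tuple(h)
```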

  17. FIND-S EXAMPLE.

     #  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
     1  Sunny  Warm     Normal    Strong  Warm   Same      Yes
     2  Sunny  Warm     High      Strong  Warm   Same      Yes
     3  Rainy  Cold     High      Strong  Warm   Change    No
     4  Sunny  Warm     High      Strong  Cool   Change    Yes

     Trace of h:
     h = <∅, ∅, ∅, ∅, ∅, ∅>                          (initialization)
     h = <Sunny, Warm, Normal, Strong, Warm, Same>    (after example 1)
     h = <Sunny, Warm, ?, Strong, Warm, Same>         (after example 2)
     h = <Sunny, Warm, ?, Strong, ?, ?>               (after example 4; example 3 is negative and ignored)

  18. FIND-S EXAMPLE: PREDICTIONS. With the learned hypothesis h = <Sunny, Warm, ?, Strong, ?, ?>:

     #  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport  Prediction
     5  Rainy  Cold     High      Strong  Warm   Change    No          No
     6  Sunny  Warm     Normal    Strong  Warm   Same      Yes         Yes
     7  Sunny  Warm     Low       Strong  Cool   Same      Yes         Yes
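Running the find_s sketch above on training examples 1 to 4 and then classifying days 5 to 7 reproduces these predictions (this reuses the hypothetical find_s and satisfies helpers defined in the earlier snippets):

```python
train = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]
h = find_s(train)
print(h)  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')

test = [
    ("Rainy", "Cold", "High",   "Strong", "Warm", "Change"),  # day 5
    ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),    # day 6
    ("Sunny", "Warm", "Low",    "Strong", "Cool", "Same"),    # day 7
]
for x in test:
    print("Yes" if satisfies(x, h) else "No")  # No, Yes, Yes
```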
