Genetic Algorithm for Attribute Selection in Data Mining


The genetic algorithm (GA) is a powerful method for attribute selection in data mining because it can efficiently explore a very large number of attribute combinations. Feature selection keeps the most important attributes and ignores the rest, in contrast to feature extraction, which projects the data onto a new, smaller feature space. The GA mimics natural adaptation: candidate solutions are represented as chromosomes, a fitness function is defined over them, crossover and mutation probabilities are set, and chromosomes are selected for reproduction according to their fitness. A simple worked example shows how the GA finds the value of x that maximizes a fitness function, and practical points such as fitness-proportionate selection, generating the initial population, and repeating runs to avoid bias contribute to its successful application to attribute selection.





Presentation Transcript


  1. Genetic algorithm for attribute selection: find the best subset of attributes for data mining. A GA is well suited to this task since it can explore many combinations of attributes.

  2. Feature selection: choose k < d important features and ignore the remaining d − k; methods include the genetic algorithm and information gain. Feature extraction: project the original d attributes onto a new k < d dimensional feature space; methods include principal components analysis (PCA), linear discriminant analysis (LDA), factor analysis (FA) and the auto-associative ANN.
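
One way to apply the GA to attribute selection, sketched below in Python, is to let each chromosome be a bit mask over the d attributes. The helper names are illustrative assumptions (for example, evaluate_subset might be the cross-validated accuracy of a classifier trained on the selected attributes); the slides do not specify this fitness.

    import random

    def random_chromosome(d):
        """Random attribute mask of length d; bit i = 1 keeps attribute i."""
        return [random.randint(0, 1) for _ in range(d)]

    def selected_attributes(chromosome):
        """Indices of the attributes the chromosome keeps."""
        return [i for i, bit in enumerate(chromosome) if bit == 1]

    def fitness(chromosome, evaluate_subset):
        """evaluate_subset is assumed to score an attribute subset, e.g. the
        cross-validated accuracy of a model trained on those attributes."""
        subset = selected_attributes(chromosome)
        return evaluate_subset(subset) if subset else 0.0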

  3. Artificial adaptation: represent a candidate solution to the problem by a chromosome; define a fitness function on the domain of all chromosomes; define the probabilities of crossover and mutation; select 2 chromosomes for reproduction based on their fitness; produce new chromosomes by crossover and mutation; evaluate the fitness of the new chromosomes. This completes one generation. Repeat until the most-fit chromosome is dominant in the population, as sketched below.
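
A minimal sketch of this loop in Python, assuming fitness, crossover and mutate functions like those defined later in the deck (all names are illustrative):

    import random

    def run_ga(fitness, population, crossover, mutate,
               p_crossover=0.75, p_mutation=0.002, generations=100):
        for _ in range(generations):
            scores = [fitness(c) for c in population]
            # fitness-proportionate ("roulette wheel") choice of two parents;
            # selection is with replacement, so the same parent can be picked twice
            i, j = random.choices(range(len(population)), weights=scores, k=2)
            mother, father = population[i], population[j]
            if random.random() < p_crossover:
                child1, child2 = crossover(mother, father)
            else:
                child1, child2 = mother, father
            child1 = mutate(child1, p_mutation)
            child2 = mutate(child2, p_mutation)
            # the selected parents are replaced by their offspring; the
            # unselected chromosomes carry over unchanged (as on slide 10)
            population[i], population[j] = child1, child2
        return max(population, key=fitness)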

  4. Simple fitness function: use the GA to discover that maximum fitness is at x = 16 (the fitness curve is a normal distribution with mean 16 and standard deviation 4; see slide 20).

  5. Problem set up. Chromosome = binary representation of integers between 0 and 31 (requires 5 bits); 0 to 31 covers the range where the fitness is significantly different from zero. Fitness of a chromosome = value of the fitness function f(x), where x is the integer equivalent of the 5-bit binary string. Crossover probability = 0.75. Mutation probability = 0.002. Size of population n = 4.
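
A short sketch of this setup (the Gaussian form of f(x) comes from slide 20; function names are illustrative). With these definitions, fitness('01001') is about 0.0216, matching the values on the following slides.

    import math
    import random

    def decode(chromosome):
        """5-bit binary string -> integer x in 0..31, e.g. decode('01001') == 9."""
        return int(chromosome, 2)

    def fitness(chromosome, mean=16.0, sd=4.0):
        """Normal-curve fitness whose maximum is at x = 16."""
        x = decode(chromosome)
        return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

    def random_population(n=4, bits=5):
        """Initial population of n random 5-bit chromosomes."""
        return [''.join(random.choice('01') for _ in range(bits)) for _ in range(n)]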

  6. Selecting chromosomes for reproduction. Calculate the fitness f(xi) for each chromosome in the population. Assign each chromosome a discrete probability p_i = f(x_i) / Σ_{i=1..n} f(x_i). Divide the number line between 0 and 1 into segments of length p_i in a specified order (usually increasing size). Get r, a random number uniformly distributed between 0 and 1. Choose the chromosome whose line segment contains r.
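
The selection scheme just described, as a small Python function (a direct sketch of the slide's procedure):

    import random

    def roulette_select(population, fitnesses):
        """Fitness-proportionate selection: lay segments of length p_i on [0, 1]
        (here in increasing order of size, as on the slide) and return the
        chromosome whose segment contains a uniform random number r."""
        total = sum(fitnesses)
        probs = [f / total for f in fitnesses]
        order = sorted(range(len(population)), key=lambda i: probs[i])
        r = random.random()
        cumulative = 0.0
        for i in order:
            cumulative += probs[i]
            if r <= cumulative:
                return population[i]
        return population[order[-1]]  # guard against floating-point round-off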

  7. 1st generation: 5-bit binary numbers chosen randomly.
       00100 = 4    fitness = 0.0011   pi = 0.044
       01001 = 9    fitness = 0.0216   pi = 0.861
       11011 = 27   fitness = 0.0023   pi = 0.091
       11111 = 31   fitness = 0.0001   pi = 0.004
     Total fitness Σ f(xi) = 0.0251. (The slide lays the segments out on a number line from 0 to 1, in increasing order of size.) What is the range of random numbers that will lead to the selection of 11011? What is the range of random numbers that will lead to the selection of 01001?

  8. 1st generation, with answers: same population and probabilities as slide 7. The range of random numbers that will lead to the selection of 11011 is 0.048 to 0.139; the range that will lead to the selection of 01001 is 0.139 to 1.0.
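
These ranges can be checked directly from the fitness values on slide 7 by accumulating the segment lengths in increasing order of size:

    fitnesses = {'00100': 0.0011, '01001': 0.0216, '11011': 0.0023, '11111': 0.0001}
    total = sum(fitnesses.values())
    start = 0.0
    for chrom, f in sorted(fitnesses.items(), key=lambda kv: kv[1]):
        end = start + f / total
        print(f"{chrom}: {start:.3f} to {end:.3f}")
        start = end
    # 11111: 0.000 to 0.004, 00100: 0.004 to 0.048,
    # 11011: 0.048 to 0.139, 01001: 0.139 to 1.000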

  9. 1st generation (same population, fitness values and selection probabilities as slide 7): assume that the random numbers selected the pair of chromosomes with the greatest fitness (01001 and 11011).

  10. Crossover selected to induce change: assume a mixing point (locus) is chosen between the first and second bit, so parents 01001 and 11011 produce children 01011 and 11001. Keep the unselected chromosomes of the 1st generation.
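
Single-point crossover and bit-flip mutation for these 5-bit chromosomes might look as follows (a sketch; locus=1 reproduces the slide's mixing point between the first and second bit):

    import random

    def single_point_crossover(parent1, parent2, locus=1):
        """Swap everything after the mixing point, so 01001 x 11011 -> 01011, 11001."""
        return (parent1[:locus] + parent2[locus:],
                parent2[:locus] + parent1[locus:])

    def mutate(chromosome, p_mutation=0.002):
        """Flip each bit independently with probability p_mutation."""
        return ''.join(bit if random.random() > p_mutation else '10'[int(bit)]
                       for bit in chromosome)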

  11. Evaluate the fitness of the new population.
       00100 = 4    fitness = 0.0011   pi = 0.0201
       01011 = 11   fitness = 0.0457   pi = 0.8339
       11001 = 25   fitness = 0.0079   pi = 0.1442
       11111 = 31   fitness = 0.0001   pi = 0.0018
     New total fitness = 0.0548. What is the range of random numbers that will lead to the selection of 11001? (0.0219 to 0.1661.) What is the range of random numbers that will lead to the selection of 01011? (0.1661 to 1.0.)

  12. Comparing generations. 1st generation: a random number between 0.048 and 0.139 selects 11011, and one between 0.139 and 1.0 selects 01001. 2nd generation: a random number between 0.0219 and 0.1661 selects 11001, and one between 0.1661 and 1.0 selects 01011. The bias of the initial guess persists.

  13. Crowding: in the initial chromosome population of this example, 01001 has 86% of the selection probability. Crowding can potentially lead to an imbalance of fitness over diversity and limits the ability of the GA to explore new regions of the search space. Solution: repeat the whole process several times to ensure that the solution is not biased by the initial guess.

  14. Positional bias: single-point crossover lets nearby loci stay together in children. Mixing genes locus by locus, as in discrete (uniform) crossover, is one of several methods to avoid positional bias.

  15. Alternatives to binary representation in a GA with real-valued variables. Arithmetic crossover: for parents <x1, x2, ..., xn> and <y1, y2, ..., yn>, choose the kth gene at random and produce children <x1, x2, ..., α·yk + (1−α)·xk, ..., xn> and <y1, y2, ..., α·xk + (1−α)·yk, ..., yn>, with 0 < α < 1. Discrete crossover: with uniform probability, each gene of the child chromosome is chosen to be the gene at the same locus in one or the other parent; for example, parents <0.5, 1.0, 1.5, 2.0> and <0.2, 0.7, 0.2, 0.7> can give child <0.2, 0.7, 1.5, 0.7>.
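
Both operators are easy to state in code; this sketch follows the definitions above (alpha fixed at 0.5 is an arbitrary choice within 0 < alpha < 1):

    import random

    def arithmetic_crossover(x, y, alpha=0.5):
        """Blend one randomly chosen gene of two real-valued parents."""
        k = random.randrange(len(x))
        child1, child2 = list(x), list(y)
        child1[k] = alpha * y[k] + (1 - alpha) * x[k]
        child2[k] = alpha * x[k] + (1 - alpha) * y[k]
        return child1, child2

    def discrete_crossover(x, y):
        """Copy each child gene from one parent or the other with equal probability."""
        return [random.choice(pair) for pair in zip(x, y)]

    # discrete_crossover([0.5, 1.0, 1.5, 2.0], [0.2, 0.7, 0.2, 0.7])
    # can return [0.2, 0.7, 1.5, 0.7], as in the example above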

  16. Feature selection by genetic algorithm and information gain. Feature selection: choose k < d important features and ignore the remaining d − k. Feature extraction: project the original d attributes onto a new k < d dimensional feature space; methods include principal components analysis (PCA), linear discriminant analysis (LDA), factor analysis (FA) and the auto-associative ANN.

  17. Very brief history of genetic algorithms: genetic algorithms were developed by John Holland in the 1960s and 1970s; he is the author of Adaptation in Natural and Artificial Systems. A more recent book on the subject is An Introduction to Genetic Algorithms by Melanie Mitchell (MIT Press, Cambridge, MA, 2002).

  18. Natural adaptation: populations of organisms are subjected to environmental stress. Fitness is manifest in the ability to survive and reproduce, and is passed to offspring by genes that are organized on chromosomes. If environmental conditions change, evolution creates a new population with different characteristics that optimize fitness under the new conditions.

  19. Basic tools of evolution. Recombination (crossover) occurs during reproduction: the chromosomes of offspring are a mixture of chromosomes from the parents. Mutation changes a single gene within a chromosome; to be expressed, the organism must survive and pass the modified chromosome to its offspring.

  20. Artificial adaptation continued: in 50-500 generations, create a population of solutions with high fitness. Repeat the whole process several times and merge the best solutions. Simple example: find the position of the maximum of a normal distribution with mean 16 and standard deviation 4.

  21. Evaluate the fitness of the new population (same numbers as slide 11).
       00100 = 4    fitness = 0.0011   pi = 0.0201
       01011 = 11   fitness = 0.0457   pi = 0.8339
       11001 = 25   fitness = 0.0079   pi = 0.1442
       11111 = 31   fitness = 0.0001   pi = 0.0018
     Σ f(xi) = 0.0548, about 2 times that of the 1st generation. Select 2 chromosomes for the next generation and repeat until the fitness of the population is almost uniform; the values of all chromosomes should then be near 16.

  22. Sigma scaling allows variable selection pressure. Sigma scaling of fitness f(x): fss(x) = 1 + (f(x) − μ) / (2σ), where μ and σ are the mean and standard deviation of fitness in the population. In early generations, selection pressure should be low to enable wider coverage of the search space (large σ). In later generations, selection pressure should be higher to encourage convergence to the optimum solution (small σ).
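
A sketch of sigma scaling as written above (the 2σ denominator follows the formulation in Mitchell's An Introduction to Genetic Algorithms and is assumed here):

    import statistics

    def sigma_scaled(fitnesses):
        """f_ss = 1 + (f - mean) / (2 * sd); if sd is 0, every value maps to 1.
        Negative scaled values are clipped to 0 so they remain usable as
        selection weights."""
        mean = statistics.mean(fitnesses)
        sd = statistics.pstdev(fitnesses)
        if sd == 0:
            return [1.0] * len(fitnesses)
        return [max(0.0, 1 + (f - mean) / (2 * sd)) for f in fitnesses]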

  23. More methods for real-valued variables. Discrete crossover: with uniform probability, each gene of the child chromosome is chosen to be the gene at the same locus in one or the other parent (e.g. parents <0.5, 1.0, 1.5, 2.0> and <0.2, 0.7, 0.2, 0.7> can give child <0.2, 0.7, 1.5, 0.7>). Normally distributed mutation: choose a random number from a normal distribution with zero mean and standard deviation comparable to the size of the genes (e.g. σ = 1 for genes scaled between -1 and +1), add it to a randomly chosen gene, and re-scale if needed.
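
A sketch of the normally distributed mutation; clipping back into range is used here as one simple way to "re-scale if needed":

    import random

    def gaussian_mutation(chromosome, sigma=1.0, low=-1.0, high=1.0):
        """Add zero-mean Gaussian noise to one randomly chosen gene and clip
        the result back into [low, high]; sigma is comparable to the gene scale."""
        k = random.randrange(len(chromosome))
        mutated = list(chromosome)
        mutated[k] = min(high, max(low, mutated[k] + random.gauss(0.0, sigma)))
        return mutated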

  24. Example of genetic algorithm with genes that are real numbers: Optimizing weights of an artificial neural network (ANN)

  25. Using a GA in training of an ANN. The ANN has 11 weights, 8 into the hidden layer and 3 into the output layer: w1A, w1B, w2A, w2B, w3A, w3B, w0A, w0B, wAZ, wBZ, w0Z. The value of the output node determines the fitness.

  26. Chromosome for weight optimization by GA: <w1A, w1B, w2A, w2B, w3A, w3B, w0A, w0B, wAZ, wBZ, w0Z>, scaled to values between -1 and +1. Use the crossover and mutation methods for real numbers to modify the chromosome. Fitness function: mean squared deviation between output and target.
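
A sketch of how such a chromosome could be evaluated. The 3-input, 2-hidden-node (A, B), 1-output (Z) topology is inferred from the weight names; the sigmoid activations and the use of negative MSE as the fitness are assumptions, not something the slides specify.

    import math

    def forward(weights, x1, x2, x3):
        """Feed-forward pass; w0A, w0B, w0Z are bias weights."""
        (w1A, w1B, w2A, w2B, w3A, w3B, w0A, w0B, wAZ, wBZ, w0Z) = weights
        sig = lambda v: 1.0 / (1.0 + math.exp(-v))
        A = sig(w1A * x1 + w2A * x2 + w3A * x3 + w0A)
        B = sig(w1B * x1 + w2B * x2 + w3B * x3 + w0B)
        return sig(wAZ * A + wBZ * B + w0Z)

    def fitness(weights, data):
        """Negative mean squared deviation between output and target, so that a
        larger value means a better chromosome; data is a list of ((x1, x2, x3), target)."""
        mse = sum((forward(weights, *xs) - t) ** 2 for xs, t in data) / len(data)
        return -mse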

  27. Use feed forward to determine the fitness of this new chromosome
