Understanding Generalized Discriminant Analysis (GDA) in Pattern Recognition
Generalized Discriminant Analysis (GDA) is a nonlinear extension of Linear Discriminant Analysis (LDA) that uses kernel methods to find discriminatory features giving the best class separability. LDA seeks projections that maximize the between-class scatter while minimizing the within-class scatter. GDA applies the same criterion after mapping the data into a high-dimensional feature space in which the classes become linearly separable; the kernel trick makes this practical by computing inner products in that space without ever performing the mapping explicitly, so the method remains computationally tractable even when the feature space is very high-dimensional.
- Pattern Recognition
- Generalized Discriminant Analysis
- Linear Discriminant Analysis
- Kernel Method
- Feature Separability
Presentation Transcript
Generalized Discriminant Analysis (GDA) Abdelrahman Taha CS 479/679 Pattern Recognition Computer Science & Engineering Department University of Nevada, Reno May 1, 2023
Definition A nonlinear form of Linear Discriminant Analysis (LDA) based on the kernel method.
Linear Discriminant Analysis (LDA) LDA seeks discriminatory features that provide the best class separability. These features are obtained by maximizing the between-class scatter while minimizing the within-class scatter.
Linear Discriminant Analysis (LDA) Let:
- x be the input vector
- C be the number of classes
- n_i be the number of samples in class i, i = 1, 2, …, C
- N be the total number of samples
- μ_i be the mean of the i-th class
- μ be the mean of the whole dataset
- S_w be the within-class covariance matrix
- S_b be the between-class covariance matrix
Linear Discriminant Analysis (LDA) Assume the desired transformation U gives y = U^T x. LDA seeks a transformation U that maximizes the following quotient:
|U^T S_b U| / |U^T S_w U|
It can be shown that the columns of U correspond to the eigenvectors obtained by solving the following eigen-problem:
S_b u = λ S_w u
Since the maximum rank of S_b is (C − 1), the maximum dimensionality of LDA is (C − 1).
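As an illustration, the LDA construction above can be sketched in a few lines of NumPy. The function name `lda_projection` and its interface are assumptions for this sketch (they are not from the slides), and `np.linalg.pinv` is used in place of a dedicated generalized eigensolver for simplicity.

```python
import numpy as np

def lda_projection(X, y, n_components=None):
    """Illustrative LDA sketch: project onto directions that maximize
    between-class scatter S_b relative to within-class scatter S_w."""
    classes = np.unique(y)
    C = len(classes)
    d = X.shape[1]
    mu = X.mean(axis=0)                        # mean of the whole dataset
    Sw = np.zeros((d, d))                      # within-class scatter
    Sb = np.zeros((d, d))                      # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)                 # mean of class c
        Sw += (Xc - mu_c).T @ (Xc - mu_c)
        Sb += len(Xc) * np.outer(mu_c - mu, mu_c - mu)
    # Solve the eigen-problem S_b u = lambda S_w u (via pinv for stability)
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    k = n_components or (C - 1)                # at most C - 1 useful directions
    U = eigvecs[:, order[:k]].real
    return X @ U
```

For two classes this yields a single projection direction, consistent with the (C − 1) bound on LDA's dimensionality.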
The Kernel Method In generalized (nonlinear) discriminants, instead of fitting a nonlinear decision boundary in the original feature space, we fit an equivalent linear decision boundary in a high-dimensional space where the classes are linearly separable. Computing this mapping explicitly, however, can be computationally demanding. Kernels circumvent the problem by evaluating inner products in the high-dimensional space without explicitly calculating the mapping Φ(x).
The Kernel Method Suppose x, y ∈ R², with x = (x1, x2)^T and y = (y1, y2)^T. Consider the following mapping Φ: R² → R³:
Φ(x) = (x1², √2 x1 x2, x2²)^T,  Φ(y) = (y1², √2 y1 y2, y2²)^T
Then
Φ(x) ⋅ Φ(y) = x1² y1² + 2 x1 x2 y1 y2 + x2² y2² = (x1 y1 + x2 y2)² = (x ⋅ y)²
The same inner product can therefore be obtained with the kernel function k(x, y) = (x ⋅ y)², that is, k(x, y) = Φ(x) ⋅ Φ(y), without computing Φ at all.
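The worked R² → R³ example can be checked numerically. The helper names `phi` and `kernel` below are illustrative; the code simply verifies that the explicit map and the degree-2 polynomial kernel produce the same inner product.

```python
import numpy as np

def phi(v):
    """Explicit mapping from the example: (x1^2, sqrt(2)*x1*x2, x2^2)."""
    x1, x2 = v
    return np.array([x1 ** 2, np.sqrt(2.0) * x1 * x2, x2 ** 2])

def kernel(x, y):
    """Degree-2 polynomial kernel k(x, y) = (x . y)^2: the same inner
    product as phi(x) . phi(y), but computed in the original space."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])
print(np.dot(phi(x), phi(y)))   # inner product via the explicit mapping
print(kernel(x, y))             # same value via the kernel trick
```

Both calls agree up to floating-point rounding, which is exactly what lets kernel methods skip the explicit mapping.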
Generalized Discriminant Analysis (GDA) Let x be a vector of the input set X with M samples, and let X_l denote the subset of X belonging to class l, so that X = ∪_{l=1}^N X_l, where N is the number of classes. Let n_l be the number of samples in class l. Following the cited paper, and assuming centered samples, the within-class covariance matrix can be computed as follows:
V = (1/M) Σ_{l=1}^N Σ_{k=1}^{n_l} x_lk x_lk^T
Baudat G., Anouar F. Generalized discriminant analysis using a kernel approach. Neural Computation. 2000;12(10):2385–2404.
Generalized Discriminant Analysis (GDA) Suppose that the original space X is mapped into a higher-dimensional feature space F through a mapping Φ. The within-class covariance matrix V in the space F can be written as
V = (1/M) Σ_{l=1}^N Σ_{k=1}^{n_l} Φ(x_lk) Φ(x_lk)^T
The between-class covariance matrix B can be written as
B = (1/M) Σ_{l=1}^N n_l φ̄_l φ̄_l^T
where φ̄_l = (1/n_l) Σ_{k=1}^{n_l} Φ(x_lk) is the mean value of class l in F.
Generalized Discriminant Analysis (GDA) To generalize LDA to the nonlinear case, we use the following kernel function:
k(x_i, x_j) = Φ(x_i) ⋅ Φ(x_j)
Then, we introduce the (M × M) kernel matrix K = (K_lm), l = 1, …, N, m = 1, …, N, where K_lm is the (n_l × n_m) block with entries k(x_li, x_mj), together with the (M × M) block-diagonal matrix
W = diag(W_1, …, W_N)
where W_l is an (n_l × n_l) matrix with all terms equal to 1/n_l. The GDA method is then formulated in the F space using K, W, V, and B.
Generalized Discriminant Analysis (GDA) As in LDA, GDA seeks to maximize the between-class scatter B and minimize the within-class scatter V by solving the following eigen-problem:
λ V ν = B ν  (1)
Multiplying both sides of this equation by Φ(x_ij)^T gives
λ Φ(x_ij)^T V ν = Φ(x_ij)^T B ν  (2)
Since the eigenvectors are linear combinations of the transformed samples,
ν = Σ_{p=1}^N Σ_{q=1}^{n_p} α_pq Φ(x_pq)
Equation (2) can be shown to be equivalent to the following quotient:
λ = (α^T K W K α) / (α^T K K α)  (3)
Generalized Discriminant Analysis (GDA) Carry out the eigen-decomposition of K:
K = P Γ P^T  (4)
where Γ is the diagonal matrix of nonzero eigenvalues and P is the matrix of the corresponding normalized eigenvectors. Substituting Equation (4) into (3) yields:
λ = (α^T P Γ P^T W P Γ P^T α) / (α^T P Γ² P^T α)  (5)
To simplify Eq. (5), let's introduce a new variable:
β = Γ P^T α  (6)
Substituting Eq. (6) into Eq. (5) gives:
λ = (β^T P^T W P β) / (β^T β)  (7)
Eq. (7) is solved for β by maximizing λ.
Generalized Discriminant Analysis (GDA) For a given β, there exists at least one α that satisfies Eq. (6), namely α = P Γ⁻¹ β. The GDA procedure can be summarized as follows:
1. Compute the K and W matrices.
2. Decompose K using eigen-decomposition.
3. Compute β and the corresponding λ.
4. Compute α using β.
5. Compute the projections of the test points onto the eigenvectors ν.
The maximum dimensionality of GDA is (N − 1).
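The summarized procedure can be sketched in NumPy. This is a minimal illustration, not the reference implementation: the Gaussian (RBF) kernel, the function names, and the `1e-9` eigenvalue cutoff are all assumptions, and the kernel matrix is additionally centered (as in Baudat and Anouar's formulation) so that the uninformative constant direction is not picked up as a discriminant.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample arrays A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def gda_fit(X, y, gamma=1.0, n_components=None):
    """Sketch of the GDA steps; variable names (K, W, beta, alpha)
    mirror the derivation in the slides."""
    M = len(X)
    classes = np.unique(y)
    # Step 1: kernel matrix K and block-diagonal W (blocks of 1/n_l)
    K = rbf_kernel(X, X, gamma)
    ones = np.ones((M, M)) / M
    K = K - ones @ K - K @ ones + ones @ K @ ones   # center K in F space
    W = np.zeros((M, M))
    for c in classes:
        idx = np.where(y == c)[0]
        W[np.ix_(idx, idx)] = 1.0 / len(idx)
    # Step 2: eigen-decompose K = P Gamma P^T, keep nonzero eigenvalues
    g, P = np.linalg.eigh(K)
    keep = g > 1e-9
    g, P = g[keep], P[:, keep]
    # Step 3: beta = leading eigenvectors of P^T W P (maximizes lambda)
    lam, vecs = np.linalg.eigh(P.T @ W @ P)
    k = n_components or (len(classes) - 1)          # max dimensionality N - 1
    beta = vecs[:, np.argsort(lam)[::-1][:k]]
    # Step 4: recover alpha = P Gamma^{-1} beta
    alpha = P @ (beta / g[:, None])
    return alpha

def gda_transform(X_train, X_test, alpha, gamma=1.0):
    # Step 5: project points, y(z) = sum_pq alpha_pq k(x_pq, z)
    return rbf_kernel(X_test, X_train, gamma) @ alpha
```

With two well-separated classes, the single projection returned (N − 1 = 1 dimension) separates the class means, matching the behavior the derivation predicts.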