Introduction to Independent Component Analysis in Math


This project presentation introduces Independent Component Analysis (ICA): the cocktail party problem, the ICA model and its assumptions, why the components must be statistically independent and non-Gaussian, negentropy as a measure of non-Gaussianity, and the FastICA algorithm, with an application to separating mixed audio signals.

  • ICA
  • Independent Component Analysis
  • Cocktail Party Problem
  • Fast ICA
  • Estimation


Presentation Transcript


  1. Introduction to Independent Component Analysis. Math 285 project, Fall 2015. Jingmei Lu and Xixi Lu, 12/10/2015.

  2. Agenda: the Cocktail Party Problem; the ICA model; the principle of ICA; the FastICA algorithm; separating a mixed audio signal; references.

  3. The Cocktail Party Problem. Two speech sources s1(t) and s2(t) are recorded by two microphones, giving the observations x1(t) and x2(t). Purpose: estimate the two original speech signals s1(t) and s2(t), using only the recorded signals x1(t) and x2(t).

  4. Motivation. [Figure: two independent source signals s_i (left) and the resulting mixture signals x_i (right), 1000 samples each.]

  5. Motivation. [Figure: the recovered signals alongside the original independent sources, 1000 samples each.]

  6. What is ICA? Independent component analysis (ICA) is a method for finding underlying factors or components from multivariate (multi-dimensional) statistical data. What distinguishes ICA from other methods is that it looks for components that are both statistically independent and non-Gaussian. (A. Hyvärinen, J. Karhunen, E. Oja, Independent Component Analysis)

  7. ICA Model. Observe n linear mixtures x1, ..., xn of n independent components: xj = aj1 s1 + aj2 s2 + ... + ajn sn, for all j, where xj is an observed random variable and sj is an independent source variable. In matrix form the ICA model is x = As, where aij is the (i, j) entry of A. Task: estimate A and s using only the observable random vector x.
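The mixing model x = As can be sketched numerically; the uniform sources, sample count, and mixing matrix below are illustrative assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources: uniform on [-sqrt(3), sqrt(3)],
# which gives zero mean and unit variance. 1000 samples each.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 1000))

# Hypothetical mixing matrix A (unknown to the algorithm in practice).
A = np.array([[2.0, 3.0],
              [2.0, 1.0]])

# Observed mixtures: x_j = a_j1 s_1 + a_j2 s_2, i.e. x = A s.
x = A @ s
print(x.shape)  # (2, 1000)
```

ICA sees only `x` and must estimate both `A` and `s` from it.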

  8. ICA Model. Two assumptions: 1. The components si are statistically independent. 2. The independent components must have non-Gaussian distributions.

  9. Why non-Gaussian? Assume: 1) s1 and s2 are Gaussian, and 2) the mixing matrix A is orthogonal. Then x1 and x2 are Gaussian, uncorrelated, and of unit variance. Their joint density is p(x1, x2) = (1/(2π)) exp(−(x1^2 + x2^2)/2).

  10. Why non-Gaussian? Since this density is completely symmetric (rotationally invariant), it does not contain any information on the directions of the columns of the mixing matrix A.

  11. Why non-Gaussian? Assume s1 and s2 follow the uniform distribution with zero mean and unit variance: p(si) = 1/(2√3) if |si| ≤ √3, and 0 otherwise. With the mixing matrix A = [2 3; 2 1] and x = As, the edges of the parallelogram formed by the mixed data are in the directions of the columns of A.

  12. Principle of ICA. Let y = w^T x, and write y = w^T As = z^T s with z = A^T w; then y is a linear combination of the si, with weights given by the zi. Central Limit Theorem: the distribution of a sum of independent random variables tends toward a Gaussian distribution, under certain conditions. Hence z^T s is more Gaussian than any single si, and becomes least Gaussian when it equals one of the si. So we can take w to be a vector that maximizes the non-Gaussianity of w^T x.
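The Central Limit Theorem argument can be illustrated with excess kurtosis, which is zero for a Gaussian variable: a mixture of two unit-variance uniform sources, with hypothetical unit-norm weights z = (0.6, 0.8), is measurably closer to Gaussian than either source. This is a sketch, not the slides' own code:

```python
import numpy as np

def excess_kurtosis(y):
    """E{y^4} - 3 (E{y^2})^2; zero for a Gaussian variable."""
    y = y - y.mean()
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

rng = np.random.default_rng(2)
# Two independent uniform sources with zero mean and unit variance.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 200_000))

# A mixture y = z^T s with unit-norm weights.
z = np.array([0.6, 0.8])
y = z @ s

print(excess_kurtosis(s[0]))  # strongly non-Gaussian (uniform)
print(excess_kurtosis(y))     # closer to 0: the mixture is "more Gaussian"
```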

  13. Measure of Non-Gaussianity. Entropy (H) measures the degree of information that an observation gives: H(Y) = −Σ_i P(Y = a_i) log P(Y = a_i). A Gaussian variable has the largest entropy among all random variables of equal variance. Negentropy: J(y) = H(y_gauss) − H(y), where y_gauss is a Gaussian variable of the same variance as y. Evaluating negentropy directly is computationally difficult.

  14. Negentropy approximations. The FastICA algorithm uses the approximation J(y) ≈ [E{G(y)} − E{G(v)}]^2, where G is some non-quadratic function and v is a Gaussian variable of zero mean and unit variance. Maximizing J(y) maximizes non-Gaussianity.
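A minimal sketch of this approximation, assuming the common choice G(u) = log cosh(u); E{G(v)} is estimated here by Monte Carlo sampling, and the sample sizes are arbitrary:

```python
import numpy as np

def negentropy_approx(y, n_gauss=500_000, seed=0):
    """J(y) ~ (E{G(y)} - E{G(v)})^2 with G(u) = log cosh(u),
    where v is a standard Gaussian variable (estimated by sampling).
    y is assumed to be zero-mean with unit variance."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n_gauss)
    G = lambda u: np.log(np.cosh(u))
    return (np.mean(G(y)) - np.mean(G(v))) ** 2

rng = np.random.default_rng(3)
gauss = rng.standard_normal(100_000)
unif = rng.uniform(-np.sqrt(3), np.sqrt(3), 100_000)

# A Gaussian sample has (near) zero negentropy; the uniform one does not.
print(negentropy_approx(gauss), negentropy_approx(unif))
```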

  15. FastICA. Data preprocessing: centering and whitening. FastICA algorithm: maximize non-Gaussianity.

  16. Data Preprocessing. Centering: subtract the mean vector so that x becomes zero-mean. Whitening: linearly transform the centered data so that its components are uncorrelated and have unit variance, e.g. via the eigenvalue decomposition of the covariance matrix.
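Centering and whitening can be sketched as follows; the eigendecomposition route and the example mixing matrix are assumptions for illustration:

```python
import numpy as np

def preprocess(x):
    """Center (subtract the mean) and whiten (decorrelate to unit variance)
    a (d x n) data matrix via the eigendecomposition of its covariance."""
    x = x - x.mean(axis=1, keepdims=True)      # centering
    d, E = np.linalg.eigh(np.cov(x))           # cov(x) = E diag(d) E^T
    return E @ np.diag(d ** -0.5) @ E.T @ x    # whitened: covariance = I

rng = np.random.default_rng(4)
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 50_000))
x = np.array([[2.0, 3.0], [2.0, 1.0]]) @ s     # hypothetical mixtures

z = preprocess(x)
print(np.cov(z))  # approximately the identity matrix
```

After whitening, the remaining (unknown) mixing matrix is orthogonal, which simplifies the FastICA iteration.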

  17. FastICA Algorithm. 1. Choose an initial weight vector w. 2. Let w+ = E{x g(w^T x)} − E{g'(w^T x)} w, where g is the derivative of the non-quadratic function G. 3. Normalize: w = w+ / ||w+||. 4. If not converged, go back to step 2; converged if ||w_new − w_old|| < ε, with ε typically around 0.0001.
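The four steps above, as a one-unit sketch in NumPy with g = tanh (the derivative of G(u) = log cosh(u)); the demo sources and mixing matrix are hypothetical, and the data are assumed whitened as in slide 16:

```python
import numpy as np

def fastica_one_unit(z, tol=1e-4, max_iter=200, seed=0):
    """One-unit FastICA on whitened data z (d x n), with g = tanh."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(z.shape[0])         # 1. initial weight vector
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        wx = w @ z
        # 2. w+ = E{z g(w^T z)} - E{g'(w^T z)} w
        w_new = (z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
        w_new /= np.linalg.norm(w_new)          # 3. normalization
        if 1 - abs(w_new @ w) < tol:            # 4. converged? (sign of w is arbitrary)
            break
        w = w_new
    return w_new

# Demo: mix two uniform sources, center and whiten, recover one direction.
rng = np.random.default_rng(5)
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 20_000))
x = np.array([[2.0, 3.0], [2.0, 1.0]]) @ s
x = x - x.mean(axis=1, keepdims=True)           # centering
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x            # whitening
y = fastica_one_unit(z) @ z                     # estimate of one source
```

The recovered `y` should match one of the sources up to sign and scale.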

  18. Separate mixed audio signal

  19. Mixed signals. [Figure: the four mixed audio waveforms, about 5 × 10^4 samples each.]

  20. Separated signals. [Figure: the four signals separated by FastICA.]

  21. Separated signals by PCA. [Figure: the four signals recovered by PCA, for comparison.]
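An end-to-end sketch of this kind of separation, extending the one-unit iteration of slide 17 with Gram-Schmidt deflation to recover several components; the synthetic "tracks" (sine, square, sawtooth) and the mixing matrix are stand-ins, not the recordings used in the slides:

```python
import numpy as np

def fastica_deflation(x, n_components, tol=1e-4, max_iter=200, seed=0):
    """Center, whiten, then run the one-unit FastICA iteration repeatedly,
    keeping each new weight vector orthogonal to those already found
    (Gram-Schmidt deflation). x is the raw (d x n) mixture matrix."""
    x = x - x.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(x))
    z = E @ np.diag(d ** -0.5) @ E.T @ x        # whitened mixtures
    rng = np.random.default_rng(seed)
    W = np.zeros((n_components, z.shape[0]))
    for i in range(n_components):
        w = rng.standard_normal(z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(max_iter):
            wx = w @ z
            w_new = (z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)  # deflation: stay orthogonal
            w_new /= np.linalg.norm(w_new)
            if 1 - abs(w_new @ w) < tol:
                break
            w = w_new
        W[i] = w_new
    return W @ z

# Demo: three synthetic "tracks" (sine, square, sawtooth), mixed and separated.
t = np.linspace(0, 1, 8_000)
s = np.vstack([np.sin(2 * np.pi * 6 * t),
               np.sign(np.sin(2 * np.pi * 3 * t)),
               2 * (t * 11 % 1) - 1])
A = np.array([[1.0, 0.5, 1.5],
              [0.7, 2.0, 1.0],
              [1.5, 1.0, 0.5]])
s_hat = fastica_deflation(A @ s, 3)
```

Each row of `s_hat` should match one of the original tracks up to order, sign, and scale, which is the intrinsic ambiguity of ICA.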

  22. Other applications Separation of Artifacts in MEG Data Finding Hidden Factors in Financial Data Reducing Noise in Natural Images Telecommunications

  23. Reference. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis, Wiley, New York, 2001. Särelä, J.: "Cocktail Party Problem" demo, 20 Apr. 2005, accessed Dec. 2015. http://research.ics.aalto.fi/ica/cocktail/cocktail_en.cgi
