Understanding Singular Value Decomposition (SVD) in Linear Algebra


Singular Value Decomposition (SVD) is a powerful technique in linear algebra that breaks any matrix down into an orthogonal stretching followed by a rotation. It gives geometric insight into linear transformations in terms of basis vectors, eigenvalues, and eigenvectors. SVD also applies to non-square matrices, where the non-square diagonal factor pads or discards dimensions, for example when converting between 2D and 3D spaces.


Uploaded on Aug 15, 2024



Presentation Transcript


  1. Lecture 13: Singular Value Decomposition (SVD). Junghoo "John" Cho, UCLA

  2. Summary: Two Worlds. World of vectors: a vector v; a linear transformation T with T(a·v1 + b·v2) = a·T(v1) + b·T(v2). World of numbers: in 1D, a tuple such as (1, 2, 0) represents a vector; in 2D, a matrix represents a linear transformation. Once basis vectors are fixed, the two worlds are in 1:1 correspondence (= isomorphic). Correspondences: symmetric matrix <-> orthogonal stretching (eigenvalue = stretching factor, eigenvector = stretching direction); orthonormal matrix <-> rotation; general matrix <-> stretching + rotation.

  3. Singular Value Decomposition (SVD). Any matrix T can be decomposed as T = U1 Σ U2^T, where Σ is a diagonal matrix and U1 and U2 are orthonormal matrices. The diagonal entries of Σ are the singular values. Example:
  U1 = [ 1/√2  -1/√2 ; 1/√2  1/√2 ],  Σ = [ 3  0 ; 0  2 ],  U2^T = [ 4/5  3/5 ; -3/5  4/5 ]
  T = U1 Σ U2^T = (1/(5√2)) [ 18  1 ; 6  17 ]
  Q: What is this transformation? What does SVD mean?
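The slide's example can be checked numerically. The sketch below, with the factor matrices as reconstructed from the transcript, multiplies them out and confirms that `numpy.linalg.svd` recovers the singular values 3 and 2:

```python
import numpy as np

# Factors of the slide's example T = U1 @ Sigma @ U2t
# (entries as reconstructed from the transcript).
U1 = np.array([[1 / np.sqrt(2), -1 / np.sqrt(2)],
               [1 / np.sqrt(2),  1 / np.sqrt(2)]])   # rotation
Sigma = np.diag([3.0, 2.0])                          # stretching factors
U2t = np.array([[ 4 / 5, 3 / 5],
                [-3 / 5, 4 / 5]])                    # change of basis

T = U1 @ Sigma @ U2t
print(T * 5 * np.sqrt(2))      # -> [[18.  1.] [ 6. 17.]]
print(np.linalg.svd(T)[1])     # singular values -> [3. 2.]
```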

  4. Singular Value Decomposition (SVD). Q: What does U2^T mean? A change of coordinates: the new basis vectors are (4/5, 3/5) and (-3/5, 4/5). Q: What does Σ mean? An orthogonal stretching: stretch x3 along the first basis vector (4/5, 3/5) and x2 along the second basis vector (-3/5, 4/5). Q: What does U1 mean? A rotation: rotate the first basis vector (4/5, 3/5) to (1/√2, 1/√2) and the second basis vector (-3/5, 4/5) to (-1/√2, 1/√2). SVD shows that any matrix (= linear transformation) is essentially an orthogonal stretching followed by a rotation.
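The three-stage reading of the slide's example can be traced factor by factor. This sketch (factors as read from the transcript) applies the change of coordinates, the stretching, and the rotation to a sample vector, and checks that the composition matches applying T directly:

```python
import numpy as np

# The three stages of T = U1 @ Sigma @ U2t, applied one at a time.
U2t = np.array([[ 4 / 5, 3 / 5],
                [-3 / 5, 4 / 5]])                  # coordinates in the new basis
Sigma = np.diag([3.0, 2.0])                        # stretch x3 and x2
U1 = np.array([[1 / np.sqrt(2), -1 / np.sqrt(2)],
               [1 / np.sqrt(2),  1 / np.sqrt(2)]]) # rotation by 45 degrees

v = np.array([1.0, 0.0])
step1 = U2t @ v          # v expressed in the basis (4/5, 3/5), (-3/5, 4/5)
step2 = Sigma @ step1    # stretched along those directions
step3 = U1 @ step2       # rotated into place
print(np.allclose(step3, (U1 @ Sigma @ U2t) @ v))  # same as applying T directly
```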

  5. What about a Non-Square Matrix T? Q: When T is an m x n matrix, what are the dimensions of U1, Σ, and U2^T? U1 is m x m, Σ is m x n, and U2^T is n x n. For a non-square T, Σ becomes a non-square diagonal matrix. When m > n: Σ = [ σ1 0 ; 0 σ2 ; 0 0 ], dimension padding (e.g., convert 2D to 3D by adding a third dimension). When m < n: Σ = [ σ1 0 0 ; 0 σ2 0 ], dimension reduction (e.g., convert 3D to 2D by discarding the third dimension).

  6. Computing SVD. Q: How can we perform SVD T = U1 Σ U2^T? Consider T^T T = (U1 Σ U2^T)^T (U1 Σ U2^T) = U2 Σ^T U1^T U1 Σ U2^T = U2 (Σ^T Σ) U2^T. Q: What kind of matrix is T^T T? It is symmetric, i.e., an orthogonal stretching: the diagonal entries of Σ^T Σ, namely σ², are its eigenvalues (the stretching factors), and the columns of U2 are its eigenvectors (the stretching directions). So we can compute U2 from the eigenvectors of T^T T. Similarly, U1 consists of the eigenvectors of T T^T. SVD can therefore be done by computing the eigenvalues and eigenvectors of T^T T and T T^T.
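The eigendecomposition route can be verified numerically: the square roots of the eigenvalues of T^T T should match the singular values returned by a direct SVD. A small sketch (the test matrix is illustrative, not from the slides):

```python
import numpy as np

# SVD via eigendecomposition of T^T T: its eigenvalues are the squared
# singular values, and its eigenvectors are the columns of U2.
T = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [0.0, 1.0]])
eigvals, U2 = np.linalg.eigh(T.T @ T)   # T^T T is symmetric, so eigh applies
sing_from_eig = np.sqrt(eigvals[::-1])  # eigh sorts ascending; svd descending
print(np.allclose(sing_from_eig, np.linalg.svd(T, compute_uv=False)))
```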

  7. Example: SVD. Q: What kind of linear transformation is T? (The slide works a concrete example, decomposing T as T = U1 Σ U2^T; the matrix entries did not survive transcription.)

  8. Summary: Two Worlds. World of vectors: a vector v; a linear transformation T with T(a·v1 + b·v2) = a·T(v1) + b·T(v2). World of numbers: in 1D, a tuple such as (1, 2, 0) represents a vector; in 2D, a matrix. Once basis vectors are fixed, the two worlds are in 1:1 correspondence (= isomorphic). Correspondences: symmetric matrix <-> orthogonal stretching (eigenvalue = stretching factor, eigenvector = stretching direction); orthonormal matrix <-> rotation; singular value decomposition <-> stretching + rotation.

  9. SVD: Application. Rank-k approximation: sometimes we may want to approximate a large m x n matrix as the product of two smaller matrices, an m x k matrix times a k x n matrix. Q: Why?

  10. Rank-k Approximation. Q: How can we decompose a matrix into a product of two rank-k matrices in the best possible way? Minimize the L2 difference (= Frobenius norm) between the original matrix and the approximation.
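Truncating the SVD gives exactly this best rank-k approximation (the Eckart-Young theorem, which the later slides rely on). A minimal sketch: the Frobenius-norm error of the truncation equals the square root of the sum of the dropped squared singular values. The matrix here is random, not from the slides:

```python
import numpy as np

# Rank-k approximation by truncated SVD: keep the k largest singular values.
def rank_k_approx(T, k):
    U1, s, U2t = np.linalg.svd(T, full_matrices=False)
    return (U1[:, :k] * s[:k]) @ U2t[:k, :]   # (m x k) times (k x n)

rng = np.random.default_rng(0)
T = rng.standard_normal((6, 4))
k = 2
T_k = rank_k_approx(T, k)
s = np.linalg.svd(T, compute_uv=False)
err = np.linalg.norm(T - T_k)                 # Frobenius norm by default
print(np.allclose(err, np.sqrt((s[k:] ** 2).sum())))  # error = dropped sigmas
```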

  11. SVD as Matrix Approximation. Example: T = U1 Σ U2^T with
  U1 = [ 1/√2  1/√2  0 ; 0  0  1 ; 1/√2  -1/√2  0 ],  Σ = [ 100  0  0 ; 0  10  0 ; 0  0  0.1 ],  U2^T = [ 3/5  4/5  0 ; 4/5  -3/5  0 ; 0  0  1 ]
  Q: If we want to reduce the rank of T to 2, what will be a good choice? The best rank-k approximation of any matrix T is to keep the first k entries of its SVD. This minimizes the L2 difference between the original and the rank-k approximation.
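The slide's example can be worked through numerically. In the sketch below, the orthonormal factors are reconstructed from the transcript (the signs are an assumption, since they did not survive transcription); with singular values 100, 10, and 0.1, the best rank-2 approximation simply drops the 0.1, so its Frobenius error is 0.1:

```python
import numpy as np

# The slide's 3 x 3 example (signs of the orthonormal factors are assumed).
U1 = np.array([[1 / np.sqrt(2),  1 / np.sqrt(2), 0.0],
               [0.0,             0.0,            1.0],
               [1 / np.sqrt(2), -1 / np.sqrt(2), 0.0]])
U2t = np.array([[3 / 5,  4 / 5, 0.0],
                [4 / 5, -3 / 5, 0.0],
                [0.0,    0.0,   1.0]])
Sigma = np.diag([100.0, 10.0, 0.1])
T = U1 @ Sigma @ U2t

# Best rank-2 choice: keep the two largest singular values, zero out the rest.
T2 = U1 @ np.diag([100.0, 10.0, 0.0]) @ U2t
print(np.linalg.norm(T - T2))   # Frobenius error: the dropped sigma, 0.1
```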

  12. SVD Approximation Example: a 1000 x 1000 matrix with entries in (0-255), i.e., grayscale pixel values. Excerpt of the matrix: 62 60 58 57 61 60 58 57 61 59 58 57 59 59 58 57 58 58 58 57 57 58 58 57 56 57 58 57 56 57 58 57 59 58 57 56 58 58 57 57 57 57 57 57 56 57 57 58 58 57 57 57 56 56 55 55 55 56 57 58 57 57 56 56 55 55 54 54 56 56 57 57 55 55 55 55 55 55 55 55 56 56 56 56 53 53 54 54 55 56 56 56 57 56 56 55 55 55 55 56 56 56 56 56 59 58 57 56 54 54 55 55 55 55 56 56 57 57 56 56

  13. Image of the original 1000 x 1000 matrix

  14. SVD: Rank-1 approximation

  15. SVD: Rank-10 approximation

  16. SVD: Rank-100 approximation

  17. Original vs. Rank-100 approximation. Q: How many numbers do we keep for each?
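The slide's counting question has a short arithmetic answer, sketched below: the full matrix stores m·n numbers, while the rank-100 factorization stores only the truncated factors (the exact bookkeeping, e.g. whether the 100 singular values are counted separately or folded into a factor, is a minor assumption):

```python
# Storage for the 1000 x 1000 example vs. its rank-100 factorization:
# U1 truncated to 100 columns, 100 singular values, U2t truncated to 100 rows.
m = n = 1000
k = 100
full = m * n                    # numbers in the original matrix
factored = m * k + k + k * n    # numbers in the rank-k factorization
print(full, factored)           # 1000000 200100, i.e. about 5x fewer
```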

  18. Dimensionality Reduction. Data with a large number of dimensions. Example: 1M users with 10M items, i.e., a 1M x 10M matrix. Q: Can we represent each user with far fewer dimensions, say 1000, without losing too much information?
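One common way to do this, in the spirit of the slides, is to keep each user's coordinates along the top-k singular directions. A small sketch on a random stand-in matrix (the sizes and data here are illustrative, not the 1M x 10M matrix):

```python
import numpy as np

# SVD-based dimensionality reduction for a user-item matrix:
# represent each user by its k coordinates in the top singular directions.
rng = np.random.default_rng(1)
ratings = rng.random((50, 80))      # 50 users x 80 items (toy stand-in)
k = 5
U1, s, U2t = np.linalg.svd(ratings, full_matrices=False)
user_vecs = U1[:, :k] * s[:k]       # each row: one user in k dimensions
print(user_vecs.shape)              # (50, 5)

# The compressed form still approximately reconstructs the full matrix:
approx = user_vecs @ U2t[:k, :]
print(approx.shape)                 # (50, 80)
```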
