Matrix Factorization Techniques and Applications


Explore the concept of matrix factorization in data analysis: the latent factors shared by otakus and characters, optimizing recommendations through gradient descent, minimizing error over observed entries, and topic analysis with matrix factorization models such as LSA and PLSA.

  • Matrix Factorization
  • Recommender Systems
  • Gradient Descent
  • Latent Factors
  • Topic Analysis


Presentation Transcript


  1. Matrix Factorization

  2. Otakus vs. Number of Figures Bought

         Char 1  Char 2  Char 3  Char 4
     A     5       3       0       1
     B     4       3       0       1
     C     1       1       0       5
     D     1       0       4       4
     E     0       1       5       4

     There are some common factors behind otakus and characters.
     http://www.quuxlabs.com/blog/2010/09/matrix-factorization-a-simple-tutorial-and-implementation-in-python/

  3. The factors are latent: they are not directly observable. [Diagram: each otaku and each character has hidden attributes; an otaku buys a character's figures when their attributes match, and ignores characters that do not match ("no one cares").]

  4. Write the table as a matrix X with M rows (number of otakus) and N columns (number of characters), and let K be the number of latent factors. Give each otaku a K-dimensional vector (r^A, r^B, ...) and each character a K-dimensional vector (r^1, r^2, ...), and model every entry of X as an inner product:

         r^A · r^1 ≈ 5,   r^B · r^1 ≈ 4,   r^C · r^1 ≈ 1,   ...

     Stacking the otaku vectors into an M×K matrix and the character vectors into a K×N matrix gives X ≈ R_otaku · R_char. When X is fully observed, singular value decomposition (SVD) yields the rank-K factorization that minimizes the reconstruction error.
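     Below is a minimal NumPy sketch of the fully observed case: truncating the SVD to K components gives the rank-K factorization with the smallest squared reconstruction error. The matrix X comes from slide 2; the variable names (R_otaku, R_char) and the choice K = 2 are illustrative, not from the deck.

```python
import numpy as np

# The fully observed otaku-by-character table from slide 2 (M = 5, N = 4).
X = np.array([[5, 3, 0, 1],
              [4, 3, 0, 1],
              [1, 1, 0, 5],
              [1, 0, 4, 4],
              [0, 1, 5, 4]], dtype=float)

K = 2  # number of latent factors (illustrative choice)

# SVD: X = U @ diag(s) @ Vt. Keeping only the top K singular values gives
# the rank-K approximation with minimum squared reconstruction error.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
R_otaku = U[:, :K] * s[:K]   # M x K matrix of otaku vectors r^A ... r^E
R_char = Vt[:K, :]           # K x N matrix of character vectors r^1 ... r^4

X_hat = R_otaku @ R_char     # rank-K reconstruction of X
print(np.round(X_hat, 1))
```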

  5. With missing entries (marked "?"), only the defined values are considered:

         Char 1  Char 2  Char 3  Char 4
     A     5       3       ?       1
     B     4       3       ?       1
     C     1       1       ?       5
     D     1       ?       4       4
     E     ?       1       5       4

     Minimize   L = Σ_{(i,j) observed} ( r^i · r^j − n_ij )²

     where n_ij is the number of figures otaku i bought of character j. Find the r^i and r^j by gradient descent.
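     Below is a sketch of the gradient-descent version for the partially observed table, in the spirit of the quuxlabs tutorial linked on slide 2. The learning rate, step count, and random initialization are illustrative assumptions, not values from the deck.

```python
import numpy as np

# Observed table from slide 5; np.nan marks the "?" entries.
X = np.array([[5, 3, np.nan, 1],
              [4, 3, np.nan, 1],
              [1, 1, np.nan, 5],
              [1, np.nan, 4, 4],
              [np.nan, 1, 5, 4]])

M, N, K = X.shape[0], X.shape[1], 2
rng = np.random.default_rng(0)
R_otaku = rng.normal(size=(M, K))   # r^i for each otaku
R_char = rng.normal(size=(N, K))    # r^j for each character

observed = [(i, j) for i in range(M) for j in range(N) if not np.isnan(X[i, j])]
lr, n_steps = 0.01, 5000            # illustrative hyperparameters

for _ in range(n_steps):
    for i, j in observed:
        # error on one observed entry: e = r^i . r^j - n_ij
        e = R_otaku[i] @ R_char[j] - X[i, j]
        # gradients of e^2 w.r.t. each latent vector
        grad_i = 2 * e * R_char[j]
        grad_j = 2 * e * R_otaku[i]
        R_otaku[i] -= lr * grad_i
        R_char[j] -= lr * grad_j

X_hat = R_otaku @ R_char.T   # predictions, including the missing entries
print(np.round(X_hat, 1))
```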

  6. Assume the dimension of r is 2, i.e. there are two latent factors. [Slide shows the 2-dimensional vectors learned by gradient descent for otakus A–E and characters 1–4, e.g. r^A ≈ (0.2, 2.1) and r^E ≈ (2.2, 0.0); the missing "?" entries are then filled in with the corresponding dot products.]
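     Continuing the sketch above with K = 2, the "?" entries can be read off from the learned vectors. The exact values depend on the random initialization, so they will only roughly match the numbers on the slide.

```python
# Read off the predictions for the "?" cells of the table above.
for i, j in [(0, 2), (1, 2), (2, 2), (3, 1), (4, 0)]:
    pred = R_otaku[i] @ R_char[j]
    print(f"otaku {'ABCDE'[i]}, character {j + 1}: predicted {pred:.1f}")
```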

  7. More about Matrix Factorization: take the individual characteristics into account,

         r^A · r^1 + b_A + b_1 ≈ 5

     where b_A captures how much otaku A likes to buy figures in general, and b_1 captures how popular character 1 is. Minimize

         L = Σ_{(i,j) observed} ( r^i · r^j + b_i + b_j − n_ij )²

     and find r^i, r^j, b_i, b_j by gradient descent (regularization can be added). Ref: Matrix Factorization Techniques For Recommender Systems.
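     Below is a sketch of the same training loop extended with the bias terms b_i and b_j plus L2 regularization. It reuses the variables from the earlier sketch; the regularization weight lam is an illustrative choice.

```python
# Extend the model to r^i . r^j + b_i + b_j, with L2 regularization.
b_otaku = np.zeros(M)   # b_i: how much otaku i likes buying figures overall
b_char = np.zeros(N)    # b_j: how popular character j is
lam = 0.02              # regularization weight (illustrative)

for _ in range(n_steps):
    for i, j in observed:
        e = R_otaku[i] @ R_char[j] + b_otaku[i] + b_char[j] - X[i, j]
        # gradients of e^2 + lam * (regularizer), computed before updating
        grad_ri = 2 * e * R_char[j] + lam * R_otaku[i]
        grad_rj = 2 * e * R_otaku[i] + lam * R_char[j]
        R_otaku[i] -= lr * grad_ri
        R_char[j] -= lr * grad_rj
        b_otaku[i] -= lr * (2 * e + lam * b_otaku[i])
        b_char[j] -= lr * (2 * e + lam * b_char[j])

X_hat = R_otaku @ R_char.T + b_otaku[:, None] + b_char[None, :]
```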

  8. Matrix Factorization for Topic Analysis: replace "character" with "document" and "otaku" with "word", and the same technique becomes latent semantic analysis (LSA). The numbers in the table are term frequencies (usually weighted by inverse document frequency), and the latent factors are topics.

         Doc 1  Doc 2  Doc 3  Doc 4
           5      3      0      1
           4      0      0      1
           1      1      0      5
           1      0      0      4
           0      1      5      4

     Probabilistic variants:
     • Probabilistic latent semantic analysis (PLSA): Thomas Hofmann, Probabilistic Latent Semantic Indexing, SIGIR, 1999.
     • Latent Dirichlet allocation (LDA): David M. Blei, Andrew Y. Ng, Michael I. Jordan, Latent Dirichlet Allocation, Journal of Machine Learning Research, 2003.
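     Below is a minimal LSA sketch along these lines: build a TF-IDF-weighted word-by-document matrix and truncate its SVD. The toy matrix reuses the numbers from the slide; the particular IDF formula is one common choice, not specified by the deck.

```python
import numpy as np

# Toy word-by-document term-frequency matrix (rows: words, cols: documents).
tf = np.array([[5, 3, 0, 1],
               [4, 0, 0, 1],
               [1, 1, 0, 5],
               [1, 0, 0, 4],
               [0, 1, 5, 4]], dtype=float)

# Weight by inverse document frequency: idf = log(n_docs / doc_freq).
n_docs = tf.shape[1]
df = (tf > 0).sum(axis=1)            # in how many documents each word appears
X = tf * np.log(n_docs / df)[:, None]

# Truncated SVD: the K latent factors play the role of topics.
K = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
word_topics = U[:, :K] * s[:K]   # how strongly each word loads on each topic
doc_topics = Vt[:K, :].T         # how strongly each document loads on each topic
```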
