Projection Matrices

Uploaded on Feb 22, 2025.




  1. Projection Matrices
     From: D.A. Harville, Matrix Algebra from a Statistician's Perspective, Springer. Chapter 12.

  2. Introduction - I
     A matrix Y in a linear space V is orthogonal to a subspace U if Y is orthogonal to every matrix in U. If every matrix in a subspace U is orthogonal to every matrix in a subspace W, then U is orthogonal to W.
     Lemma 12.1.1. Let U and W be subspaces of V, with X_1, ..., X_s spanning U and Z_1, ..., Z_t spanning W, and let Y be a matrix in V. Then:
     - Y is orthogonal to U iff Y·X_i = trace(Y'X_i) = 0 for i = 1, ..., s
     - U is orthogonal to W iff trace(X_i'Z_j) = 0 for i = 1, ..., s; j = 1, ..., t
     Corollary 12.1.2. For y in R^m, X an m x n matrix, and Z an m x p matrix:
     - y is orthogonal (wrt the usual inner product) to C(X) iff X'y = 0
     - C(X) is orthogonal to C(Z) iff X'Z = 0

  3. Introduction - II
     Theorem 12.1.3. Let U be an r-dimensional subspace of V and let Y be a matrix in V. Then there exists a unique matrix Z in U such that Y - Z is orthogonal to U, namely
     Z = c_1 X_1 + ... + c_r X_r, where c_j = Y·X_j and X_1, ..., X_r is an orthonormal basis of U
     (for r = 0, i.e. U = {0}, take Z = 0).
     Proof: For r = 0, Z = 0 and Y - Z = Y is orthogonal to U = {0}. For r > 0, with Z = Σ_j c_j X_j and c_j = Y·X_j:
     (Y - Z)·X_i = Y·X_i - Σ_j c_j X_j·X_i = c_i - c_i = 0, i = 1, ..., r,
     so for any X = Σ_j k_j X_j in U, (Y - Z)·X = Σ_j k_j (Y - Z)·X_j = 0.
     Uniqueness: if Z* in U also satisfies (Y - Z*)·X = 0 for all X in U, then (Z - Z*)·(Z - Z*) = (Y - Z*)·(Z - Z*) - (Y - Z)·(Z - Z*) = 0 - 0 = 0, so Z* = Z.
     Conversely, if Y is in U, then Y - Y = 0 is orthogonal to U.
     Implication: the unique Z in U such that Y - Z is orthogonal to U is called the orthogonal projection (or simply the projection) of Y on U. If Y is in U (so Y - Y = 0 forms a 90° angle with all nonnull matrices in U), Y is the projection of itself on U.

  4. Introduction - III
     Example: V = R^{n x 1} and U the p-dimensional subspace formed by the column space C(X) of an n x p matrix X with rank p. Any matrix in U can be written as XB for some p x 1 matrix B. With Z = XB* and B* = (X'X)^{-1}X'Y:
     Y - Z = Y - X(X'X)^{-1}X'Y = (I - X(X'X)^{-1}X')Y,
     and for any XB in U,
     (XB)'(Y - Z) = B'(X' - X'X(X'X)^{-1}X')Y = B'(X' - X')Y = 0,
     so Z = X(X'X)^{-1}X'Y is the projection of Y on U.
     Theorem 12.1.4. Let Z_1, ..., Z_p be the projections of Y_1, ..., Y_p on U. Then for any scalars k_1, ..., k_p, the projection of Σ_i k_i Y_i on U is Σ_i k_i Z_i.
     Proof: Each Z_i is in U, so Σ_i k_i Z_i is in U (U is a linear space). For any X in U:
     X·(Σ_i k_i Y_i - Σ_i k_i Z_i) = Σ_i k_i X·(Y_i - Z_i) = 0,
     so Σ_i k_i Z_i is the projection of Σ_i k_i Y_i on U.

  5. Projection of a Column Vector - I
     Theorem 12.2.1. Let z be the projection of y in R^n on a subspace U spanned by the columns of an n x p matrix X. Then z = Xb* for any solution b* to the linear system X'Xb = X'y (which is consistent).
     Proof: X'y is in C(X') = C(X'X) (Th. 7.4.1: C(A'A) = C(A')), so the system is consistent. Let b* be a solution; then for any Xb in U:
     (Xb)'(y - Xb*) = b'(X'y - X'Xb*) = 0,
     so y - Xb* is orthogonal to U, and since Xb* is in U, z = Xb* is the projection of y on U.
     Normal equations (written out):
     [ x_1'x_1 ... x_1'x_p ] [ b_1 ]   [ x_1'y ]
     [   ...        ...    ] [ ... ] = [  ...  ]
     [ x_p'x_1 ... x_p'x_p ] [ b_p ]   [ x_p'y ]
     i.e. Σ_j (x_i'x_j) b_j = x_i'y for i = 1, ..., p.
     If the columns of X are an orthonormal basis of U, then X'X = I and b = X'y.
     Corollary 12.2.2. The projection of y on U is z = X(X'X)^- X'y, where (X'X)^- is any generalized inverse of X'X.
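
     The normal-equations recipe above can be checked numerically. A minimal NumPy sketch (not part of the original slides), using the 2-dimensional example from the next slide, x = (1, 1)' and y = (6, 4)':

```python
import numpy as np

# U = span{x} with x = (1, 1)', y = (6, 4)'.
x = np.array([[1.0], [1.0]])
y = np.array([6.0, 4.0])

# Solve the normal equations x'xb = x'y: here x'x = 2 and x'y = 10, so b = 5.
b = np.linalg.solve(x.T @ x, x.T @ y)

# Projection of y on U and its residual.
z = (x @ b).ravel()
resid = y - z

# The residual is orthogonal to U: x'(y - z) = 0.
```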

  6. Projection of a Column Vector - II
     Corollary 12.2.3. Let X be n x p and W n x q with C(W) = C(X). Then Wa* = Xb* for any solutions a*, b* to W'Wa = W'y and X'Xb = X'y.
     Corollary 12.2.4. For y in R^n and X n x p, Xb_1 = Xb_2 for any two solutions b_1, b_2 to X'Xb = X'y.
     Theorem 12.2.5. Let z be the projection of y in R^n on a subspace U spanned by the columns of the n x p matrix X. Then a vector b* satisfies Xb* = z iff b* is a solution to X'Xb = X'y.
     Proof: X'Xb = X'y has a solution, say b_0, and Xb_0 = z (Th. 12.2.1). If Xb* = z, then X'Xb* = X'z = X'Xb_0 = X'y; conversely, any solution b* gives Xb* = z (Cor. 12.2.4).
     Example in 2 dimensions: U = span{x} with x = (1, 1)' (the line at a 45° angle through the origin) and y = (6, 4)'. Then
     x'x = 2, x'y = 10, b = x'y / x'x = 5, z = xb = (5, 5)',
     and y - z = (1, -1)' satisfies x'(y - z) = 0.

  7. Example: Maya Moore Points (Y) and Minutes (X) in n = 2 WNBA Games
     y = (27, 33)', x = (36, 33)'
     x'x = 36(36) + 33(33) = 2385; x'y = 36(27) + 33(33) = 2061
     b = (x'x)^{-1} x'y = 2061/2385 = 0.8642
     P = x(x'x)^{-1}x' = [ 0.5434  0.4981 ]
                         [ 0.4981  0.4566 ]
     y-hat = z = xb = Py = 0.8642 (36, 33)' = (31.1094, 28.5170)'
     e = y - y-hat = (I - P)y = (-4.1094, 4.4830)'
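
     These numbers can be reproduced in NumPy. A small sketch (the data come from the slide; the code itself is illustrative, not from the original):

```python
import numpy as np

x = np.array([36.0, 33.0])    # minutes played
y = np.array([27.0, 33.0])    # points scored

b = (x @ y) / (x @ x)         # regression-through-origin slope, 2061/2385
P = np.outer(x, x) / (x @ x)  # projection matrix x(x'x)^{-1}x'
z = P @ y                     # fitted values y-hat
e = y - z                     # residuals, orthogonal to x
```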

  8. Example: Maya Moore Points (Y) and Minutes (X) in n = 2 WNBA Games
     [Figure: "Projection for Regression Through Origin, n = 2". Plot of Y2 vs Y1 showing y = (27, 33)', y-hat = (31.11, 28.52)', and e = (-4.11, 4.48)'.]

  9. Example: 4 dimensions
     X = [x_1 x_2 x_3] with x_1 = (1, 1, 1, 1)', x_2 = (1, 1, 0, 0)', x_3 = (0, 0, 1, 1)'; y = (16, 18, 7, 9)'.
     Since x_1 = x_2 + x_3, rank(X) = 2 and U = C(X) = span{x_1, x_2, x_3} = {Xa = a_1 x_1 + a_2 x_2 + a_3 x_3 for any scalars a_1, a_2, a_3}.
     X'X = [ 4 2 2 ]      X'y = [ 50 ]
           [ 2 2 0 ]            [ 34 ]
           [ 2 0 2 ]            [ 16 ]
     Using the generalized inverse (X'X)^- = [ 0 0 0; 0 1/2 0; 0 0 1/2 ]:
     b = (X'X)^- X'y = (0, 17, 8)'
     z = Xb = (17, 17, 8, 8)'
     y - z = (-1, 1, -1, 1)' and X'(y - z) = 0.
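
     Since rank(X) = 2, X'X is singular and a generalized inverse is needed. The slide uses one particular g-inverse; the Moore-Penrose pseudoinverse is another valid choice, and the projection z comes out the same for any choice. A NumPy sketch (code not from the original slides):

```python
import numpy as np

# x1 = x2 + x3, so X has rank 2; the normal equations are still consistent.
X = np.array([[1.0, 1, 0],
              [1.0, 1, 0],
              [1.0, 0, 1],
              [1.0, 0, 1]])
y = np.array([16.0, 18.0, 7.0, 9.0])

# Any g-inverse of X'X works; the pseudoinverse is a convenient one.
b = np.linalg.pinv(X.T @ X) @ (X.T @ y)
z = X @ b          # projection of y on C(X): invariant to the g-inverse
w = y - z          # residual, orthogonal to every column of X
```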

  10. Projection Matrices - I
     Theorem 12.3.1. Let U be any subspace of R^n (the space of all n-dimensional column vectors). Then there exists a unique n x n matrix A such that Ay is the projection of y on U for every y in R^n, namely
     A = P_X = X(X'X)^- X', for any X s.t. C(X) = U.
     Proof: Let X be any matrix s.t. C(X) = U. Then P_X y is the projection of y on U for every y (Cor. 12.2.2). If A is such that Ay is the projection of y on U for every y in R^n, then (A - P_X)y = 0 for all y, so A = P_X: there is a unique projection matrix.
     A = P_X = X(X'X)^- X' is referred to as the orthogonal projection matrix for U, or simply the projection matrix for U.
     Corollary 12.3.2. A = P_X for any X s.t. C(X) = U.
     Corollary 12.3.3. A is a projection matrix iff A = P_X for some matrix X.

  11. Projection Matrices - II
     Theorem 12.3.4. Let X be any n x p matrix. Then:
     (1) P_X X = X(X'X)^- X'X = X, i.e. (X'X)^- X' is a g-inverse of X.
         (Cor. 5.3.3: A'AB = A'AC implies AB = AC; with A = X, B = (X'X)^- X'X, C = I: X'X(X'X)^- X'X = X'X implies X(X'X)^- X'X = XI = X.)
     (2) P_X = XB* for any solution B* to X'XB = X'.
         (B* = (X'X)^- X' + [I - (X'X)^- X'X]Y for some Y, so XB* = X(X'X)^- X' + [X - X(X'X)^- X'X]Y = P_X + 0, from (1).)
     (3) P_X' = [X(X'X)^- X']' = X[(X'X)^-]' X' = X(X'X)^- X' = P_X (symmetric),
         making use of the invariance of A(A'A)^- A' to the choice of g-inverse of A'A.
     (4) [(X'X)^-]' is a g-inverse of X'X (from (3) and (1)).
     (5) X'P_X = X' and P_X X = X; (X'X)^- X' and [(X'X)^-]' X' are g-inverses of X (from (3) and (1)).
     (6) P_X P_X = X(X'X)^- X'X(X'X)^- X' = X(X'X)^- X' = P_X (idempotent, from (1)).
     (7) C(P_X) = C(X) and R(P_X) = R(X').
     (8) rank(P_X) = rank(X).
     (9) (I - P_X)' = I - P_X, (I - P_X)(I - P_X) = I - P_X, and P_X(I - P_X) = (I - P_X)P_X = 0.
     (10) rank(I - P_X) = trace(I - P_X) = n - trace(P_X) = n - rank(X) (from (8), (9), Lemma 10.2.4).
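
     Properties (1), (3), (6), (8), and (10) can be spot-checked numerically on a random full-column-rank X. A sketch (not from the slides; the random matrix is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))          # full column rank (almost surely)
P = X @ np.linalg.inv(X.T @ X) @ X.T     # P_X = X(X'X)^{-1}X'

sym = np.allclose(P, P.T)                # (3) P_X' = P_X
idem = np.allclose(P @ P, P)             # (6) P_X P_X = P_X
fixes = np.allclose(P @ X, X)            # (1) P_X X = X
rank_P = np.linalg.matrix_rank(P)        # (8) rank P_X = rank X
trace_IP = np.trace(np.eye(6) - P)       # (10) trace(I - P_X) = n - rank X
```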

  12. Projection Matrices - III
     Theorem 12.3.5. Let X be n x p and W n x q s.t. C(W) is contained in C(X). Then: 1) P_X W = W and W'P_X = W'; 2) P_X P_W = P_W P_X = P_W.
     Proof: (1) C(W) contained in C(X) implies W = XF for some matrix F, so P_X W = P_X XF = XF = W.
     (2) P_X P_W = P_X W(W'W)^- W' = W(W'W)^- W' = P_W, and P_W P_X = (P_X' P_W')' = (P_X P_W)' = P_W' = P_W.
     Corollary 12.3.6. If X n x p and W n x q are s.t. C(W) = C(X), then P_W = P_X (from Th. 12.3.5(2) and Th. 12.3.1).
     Corollary 12.3.7. For any projection matrix A for some subspace U of C(X): P_X A = A P_X = A.
     Proof: Let W be a matrix whose columns span U; then by Cor. 12.3.2 A = P_W, and since C(W) is contained in C(X): P_X A = A P_X = A.
     Theorem 12.3.8. Suppose A = P_X is the projection matrix for some subspace U of R^n and that the columns of X span U. Then C(A) = C(X) = U and dim(U) = rank(A).
     Theorem 12.3.9. A matrix A is a projection matrix iff it is symmetric and idempotent.
     Proof: From Cor. 12.3.3 and Th. 12.3.4 we know projection matrices are symmetric and idempotent. We need to show all symmetric, idempotent matrices are projection matrices. Suppose A is symmetric and idempotent: then A'A = AA = A, so any g-inverse A^- of A is a g-inverse of A'A, and
     P_A = A(A'A)^- A' = A A^- A' = A A^- A = A,
     so A is the projection matrix for C(A).
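
     Theorem 12.3.9 can be illustrated by building a symmetric idempotent matrix directly and checking that it equals the projection matrix for its own column space. An assumed small example (not from the slides):

```python
import numpy as np

# A rank-1 symmetric idempotent matrix: uu'/(u'u).
u = np.array([[1.0], [2.0], [2.0]])
A = (u @ u.T) / (u.T @ u).item()

# A is symmetric and idempotent, hence a projection matrix, and it
# coincides with P_A = A(A'A)^- A' built from its own columns.
P_A = A @ np.linalg.pinv(A.T @ A) @ A.T
```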

  13. Least Squares
     Let Y be a matrix in a linear space V with subspace U, and let Z be the unique projection of Y on U. Define the norm ||A|| = [trace(A'A)]^{1/2}, so that ||A||^2 = A·A and the distance between Y and W is ||Y - W||.
     Theorem 12.4.1. For W in U, the distance ||Y - W|| is uniquely minimized by taking W to be the projection Z of Y on U:
     ||Y - W||^2 = ||(Y - Z) + (Z - W)||^2 = ||Y - Z||^2 + 2(Y - Z)·(Z - W) + ||Z - W||^2 = ||Y - Z||^2 + ||Z - W||^2 >= ||Y - Z||^2,
     since Z - W is in U and Y - Z is orthogonal to U (equality only if W = Z). Moreover,
     ||Y - Z||^2 = (Y - Z)·(Y - Z) = Y·Y - Y·Z, since Z·(Y - Z) = 0.
     Now let y in R^n with a subspace U of R^n, with the goal of choosing w in U to minimize the sum of squares (y - w)'(y - w) = Σ_i (y_i - w_i)^2, or its square root ||y - w||.
     Theorem 12.4.2. Let U = C(X) for an n x p matrix X (every w in U is w = Xc for some vector c). Then (y - w)'(y - w), for w in U, is minimized by w = Xb*, where b* is a solution to X'Xb = X'y, and
     min (y - w)'(y - w) = (y - P_X y)'(y - P_X y) = y'(I - P_X)'(I - P_X)y = y'(I - P_X)y.
     Theorem 12.4.3. For y in R^n and X n x p, (y - Xb)'(y - Xb) over b in R^p is minimized at b = b*, a solution to the normal equations X'Xb = X'y, with Xb* = P_X y and (y - Xb*)'(y - Xb*) = y'(I - P_X)y.
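
     Theorem 12.4.3's claim, that the normal-equations solution minimizes the sum of squares with minimum y'(I - P_X)y, can be spot-checked numerically. A sketch on made-up data (the X, y here are assumptions for illustration, not the slides' data):

```python
import numpy as np

X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 5.0], [1.0, 7.0]])
y = np.array([3.0, 4.0, 7.0, 9.0])

b_star = np.linalg.solve(X.T @ X, X.T @ y)   # solves X'Xb = X'y

def sse(b):
    r = y - X @ b
    return float(r @ r)

# Minimum value equals y'(I - P_X)y.
P = X @ np.linalg.inv(X.T @ X) @ X.T
min_sse = float(y @ (np.eye(4) - P) @ y)

# Perturbing b_star in any coordinate direction cannot decrease the SSE.
no_better = all(sse(b_star) <= sse(b_star + d) + 1e-12
                for d in [np.array([0.1, 0.0]), np.array([0.0, 0.1]),
                          np.array([-0.1, 0.0]), np.array([0.0, -0.1])])
```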

  14. Orthogonal Complements - I
     Let V be a linear space with subspace U. The set of all matrices in V orthogonal to U is the orthogonal complement of U (relative to V), written U^⊥; it is itself a subspace of V.
     Lemma 12.5.1. If B_1, ..., B_k spans U, then A in V is in U^⊥ iff A·B_j = 0 for j = 1, ..., k.
     For an n x p matrix X:
     - C^⊥(X), the orthogonal complement of C(X) relative to R^n (n-dim column vectors), is the set of all solutions z to X'z = 0, i.e. N(X').
     - R^⊥(X), the orthogonal complement of R(X) relative to R^p (p-dim row vectors), is defined analogously.
     Lemma 12.5.2. For any n x p matrix X: C^⊥(X) = N(X') = C(I - P_X), since X'(I - P_X) = 0 and X'z = 0 implies z = (I - P_X)z.
     Corollary 12.5.3. For any n x p matrix X: dim C^⊥(X) = n - rank(X) = n - dim C(X).
     Theorem 12.5.4. A is an element of U iff it is orthogonal to every matrix in U^⊥, i.e. (U^⊥)^⊥ = U.
     Proof (of the "if" part): Suppose A is orthogonal to every matrix in U^⊥. Let A_U be the projection of A on U; then A - A_U is in U^⊥, so
     (A - A_U)·(A - A_U) = A·(A - A_U) - A_U·(A - A_U) = 0 - 0 = 0,
     giving A = A_U, which is in U.
     Corollary 12.5.5. If W is contained in U, then U^⊥ is contained in W^⊥.
     Corollary 12.5.6. For subspaces U, W of V: 1) U^⊥ is contained in W^⊥ iff W is contained in U; 2) U^⊥ = W^⊥ iff U = W.

  15. Orthogonal Complements - II
     Corollary 12.5.7. Let X be n x p and Z n x s, where Z is any matrix with columns spanning C^⊥(X) = N(X'). Then for any n x q matrix Y: C(Y) is contained in C(X) iff Z'Y = 0, and C(Y) is contained in C^⊥(X) iff X'Y = 0.
     Proof sketch: C(Y) is contained in C(X) iff Y is orthogonal to C^⊥(X) = C(Z), i.e. iff Z'Y = 0; equivalently, the projection Z(Z'Z)^- Z'Y of Y on C(Z) is 0 iff Z'Y = 0.
     Theorem 12.5.8. If Z is the projection of Y on U, then Y - Z is the projection of Y on U^⊥.
     Proof: By definition Y - Z is in U^⊥. We need to show Y - (Y - Z) = Z is orthogonal to U^⊥: Z is in U, and every matrix in U is orthogonal to U^⊥.
     Corollaries 12.5.9-10. Let z be the projection of the column vector y on C(X) (X n x p) and b* a solution to X'Xb = X'y. Then
     y - z = y - Xb* = y - X(X'X)^- X'y = (I - P_X)y
     is the projection of y on C^⊥(X), and I - P_X is the projection matrix for C^⊥(X).
     Theorem 12.5.11. Let Z be the projection of Y on U and W the projection of Y on U^⊥. Then Y = Z + W, and this is the unique decomposition of Y as a sum of a matrix in U and a matrix in U^⊥.
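
     Theorem 12.5.11's decomposition y = z + w can be sketched numerically. A small assumed example (X and y chosen for illustration, not from the slides):

```python
import numpy as np

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 2.0])

P = X @ np.linalg.inv(X.T @ X) @ X.T
z = P @ y                     # projection of y on C(X)
w = (np.eye(3) - P) @ y       # projection of y on C^perp(X) = N(X')

# y = z + w, with w orthogonal to every column of X.
```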

  16. Example: Maya Moore Points (Y) and Minutes (X) in n = 2 WNBA Games
     y = (27, 33)', x = (36, 33)', U = C(x) = {cx for scalar c}
     P_x = x(x'x)^{-1}x' = [ 0.5434  0.4981 ]     I - P_x = [  0.4566  -0.4981 ]
                           [ 0.4981  0.4566 ]               [ -0.4981   0.5434 ]
     y-hat = z = xb = P_x y = 0.8642 (36, 33)' = (31.1094, 28.5170)'
     e = y - y-hat = (I - P_x)y = (-4.1094, 4.4830)'

  17. Maya Moore Regression Through Origin (X = Minutes, Y = Points)
     [Figure: plot of y, its projection z, and the complement projection w.]

  18. Example: 4 dimensions
     X = [x_1 x_2 x_3], x_1 = (1, 1, 1, 1)', x_2 = (1, 1, 0, 0)', x_3 = (0, 0, 1, 1)', rank(X) = 2; y = (16, 18, 7, 9)'; U = C(X) = {Xa for any scalars a_1, a_2, a_3}.
     X'X = [ 4 2 2; 2 2 0; 2 0 2 ], X'y = (50, 34, 16)'
     P_X = X(X'X)^- X' = [ 1/2 1/2  0   0  ]     I - P_X = [  1/2 -1/2   0    0  ]
                         [ 1/2 1/2  0   0  ]               [ -1/2  1/2   0    0  ]
                         [  0   0  1/2 1/2 ]               [   0    0   1/2 -1/2 ]
                         [  0   0  1/2 1/2 ]               [   0    0  -1/2  1/2 ]
     z = P_X y = (17, 17, 8, 8)'
     w = (I - P_X)y = (-1, 1, -1, 1)'

  19. Dimension of Orthogonal Complement
     dim C^⊥(X) = n - rank(X) = n - dim C(X)
     Let U be an r-dimensional subspace of an n-dimensional linear space V, and let U^⊥ be k-dimensional. Let A_1, ..., A_r be an orthonormal basis of U and B_1, ..., B_k an orthonormal basis of U^⊥, and set S = {A_1, ..., A_r, B_1, ..., B_k}. Every Y in V can be written as Y = Z + W with Z in U and W in U^⊥, so S spans V; and A_i·B_j = 0 for i = 1, ..., r; j = 1, ..., k, so S is an orthonormal basis of V. Hence n = r + k.
     Theorem 12.5.12. dim U^⊥ = dim V - dim U.
     Examples:
     - Maya Moore: dim C^⊥(x) = n - rank(x) = 2 - 1 = 1
     - 4-dimensional case: dim C^⊥(X) = n - rank(X) = 4 - 2 = 2
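
     The two dimension counts on this slide can be confirmed via matrix rank, reusing the X matrices from the earlier examples (code itself not from the slides):

```python
import numpy as np

# Maya Moore case: one spanning vector in R^2, so the complement has dim 1.
x = np.array([[36.0], [33.0]])
dim_perp_1 = 2 - np.linalg.matrix_rank(x)    # n - rank = 2 - 1

# 4-dimensional case: three columns but rank 2, since x1 = x2 + x3.
X = np.array([[1.0, 1, 0], [1, 1, 0], [1, 0, 1], [1, 0, 1]])
dim_perp_2 = 4 - np.linalg.matrix_rank(X)    # n - rank = 4 - 2
```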
