Fundamental Concepts in Vector Spaces and Inner Product Spaces

A vector space over a field F is characterized by two operations, addition and scalar multiplication. Subspaces, direct sums, linear combinations, linear spans, dimensions, and dual spaces are fundamental concepts in vector spaces. Moving into inner product spaces, the concepts of inner products, norms, orthogonality, orthogonal complements, orthonormal sets, and key inequalities such as the Schwarz inequality are explored. This guide covers the essential notions in both vector spaces and inner product spaces.


Uploaded on Sep 21, 2024



Presentation Transcript


  1. UNIT I
  VECTOR SPACE: A nonempty set V is said to be a vector space over a field F if V is an abelian group under an operation which we denote by +, and if for every α ∈ F, v ∈ V there is defined an element, written αv, in V subject to 1. α(v + w) = αv + αw; 2. (α + β)v = αv + βv; 3. α(βv) = (αβ)v; 4. 1v = v for all α, β ∈ F, v, w ∈ V (where the 1 represents the unit element of F under multiplication).
  SUBSPACE: If V is a vector space over F and if W ⊆ V, then W is a subspace of V if, under the operations of V, W itself forms a vector space over F. Equivalently, W is a subspace of V whenever w1, w2 ∈ W, α, β ∈ F implies that αw1 + βw2 ∈ W.
  HOMOMORPHISM: If U and V are vector spaces over F, then the mapping T of U into V is said to be a homomorphism if 1. (u1 + u2)T = u1T + u2T; 2. (αu1)T = α(u1T); for all u1, u2 ∈ U and all α ∈ F.

  2. INTERNAL DIRECT SUM: Let V be a vector space over F and let U1, ..., Un be subspaces of V. V is said to be the internal direct sum of U1, ..., Un if every element v ∈ V can be written in one and only one way as v = u1 + u2 + ... + un where ui ∈ Ui.
  LINEAR COMBINATION: If V is a vector space over F and if v1, ..., vn ∈ V, then any element of the form α1v1 + α2v2 + ... + αnvn, where the αi ∈ F, is a linear combination over F of v1, ..., vn.
  LINEAR SPAN: If S is a nonempty subset of the vector space V, then L(S), the linear span of S, is the set of all linear combinations of finite sets of elements of S.
  LINEARLY DEPENDENT: If V is a vector space and if v1, ..., vn are in V, we say that they are linearly dependent over F if there exist elements λ1, ..., λn in F, not all of them 0, such that λ1v1 + λ2v2 + ... + λnvn = 0.
  BASIS: A subset S of a vector space V is called a basis of V if S consists of linearly independent elements (that is, any finite number of elements in S is linearly independent) and V = L(S).
  DIMENSION: If V is finite-dimensional over F then V is isomorphic to F(n) for a unique integer n; in fact, n is the number of elements in any basis of V over F. The integer n is called the dimension of V over F.
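As an illustration (not part of the original slides), linear dependence over the rationals can be tested mechanically: row-reduce the vectors and compare the rank with the number of vectors. The function name and examples below are hypothetical, chosen only to demonstrate the definition.

```python
from fractions import Fraction

def linearly_dependent(vectors):
    """Return True if the given vectors (tuples over Q) are linearly
    dependent: row-reduce them and check whether the rank (number of
    pivot rows) is smaller than the number of vectors."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    rank = 0
    for col in range(len(rows[0]) if rows else 0):
        # find a pivot row with a nonzero entry in this column
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        # clear the column in every other row
        for r in range(len(rows)):
            if r != rank and rows[r][col] != 0:
                factor = rows[r][col] / rows[rank][col]
                rows[r] = [a - factor * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank < len(rows)

# (1, 2, 3) is a linear combination of the three unit vectors, so the
# four vectors together are linearly dependent:
print(linearly_dependent([(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 2, 3)]))  # True
print(linearly_dependent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))             # False
```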

  3. LEMMA: If V is finite-dimensional and if W is a subspace of V, then W is finite-dimensional, dim W ≤ dim V and dim V/W = dim V − dim W.
  COROLLARY: If A and B are finite-dimensional subspaces of a vector space V, then A + B is finite-dimensional and dim(A + B) = dim(A) + dim(B) − dim(A ∩ B).
  DUAL SPACE: If V is a vector space then its dual space is Hom(V, F). An element of the dual space will be called a linear functional on V into F.
  ANNIHILATOR: If W is a subspace of V then the annihilator of W is A(W) = {f ∈ Hom(V, F) | f(w) = 0 for all w ∈ W}.

  4. UNIT II
  INNER PRODUCT SPACE: The vector space V over F is said to be an inner product space if there is defined for any two vectors u, v ∈ V an element (u, v) in F such that 1. (u, v) is the conjugate of (v, u); 2. (u, u) ≥ 0, and (u, u) = 0 if and only if u = 0; 3. (αu + βv, w) = α(u, w) + β(v, w); for any u, v, w ∈ V and α, β ∈ F.
  LENGTH: If v ∈ V then the length of v (or norm of v), written ||v||, is defined by ||v|| = √(v, v).
  COROLLARY: ||αu|| = |α| ||u||.
  SCHWARZ INEQUALITY: If u, v ∈ V then |(u, v)| ≤ ||u|| ||v||.
  ORTHOGONAL: If u, v ∈ V then u is said to be orthogonal to v if (u, v) = 0.
  ORTHOGONAL COMPLEMENT: If W is a subspace of V, the orthogonal complement of W, W⊥, is defined by W⊥ = {x ∈ V | (x, w) = 0 for all w ∈ W}.
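As a numerical sketch (not from the slides), the Schwarz inequality can be checked in R^n with the standard dot product; the vectors below are arbitrary choices for illustration.

```python
import math

def inner(u, v):
    """Standard inner product (dot product) on R^n."""
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    """||v|| = sqrt((v, v))."""
    return math.sqrt(inner(v, v))

u, v = (1.0, 2.0, -1.0), (3.0, 0.5, 4.0)
lhs = abs(inner(u, v))      # |(u, v)|
rhs = norm(u) * norm(v)     # ||u|| ||v||
print(lhs <= rhs)  # True, as the Schwarz inequality guarantees
```

Equality holds exactly when one vector is a scalar multiple of the other.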

  5. NOTE: W ∩ W⊥ = (0).
  ORTHONORMAL: The set of vectors {vi} in V is an orthonormal set if 1. each vi is of length 1 (i.e., (vi, vi) = 1); 2. for i ≠ j, (vi, vj) = 0.
  GRAM-SCHMIDT ORTHOGONALIZATION PROCESS: Let V be a finite-dimensional inner product space; then V has an orthonormal set as a basis.
  COROLLARY: If V is a finite-dimensional inner product space and W is a subspace of V, then (W⊥)⊥ = W.
  R-MODULE: Let R be any ring; a nonempty set M is said to be an R-module (or, a module over R) if M is an abelian group under an operation + such that for every r ∈ R and m ∈ M there exists an element rm in M subject to 1. r(a + b) = ra + rb; 2. r(sa) = (rs)a; 3. (r + s)a = ra + sa for all a, b ∈ M and r, s ∈ R.
  FINITELY GENERATED: An R-module M is said to be finitely generated if there exist elements a1, ..., an ∈ M such that every m in M is of the form m = r1a1 + r2a2 + ... + rnan.
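The Gram-Schmidt process named above can be sketched in Python for R^n with the standard dot product (an illustration assuming the input vectors are linearly independent; not part of the original slides):

```python
import math

def gram_schmidt(vectors):
    """Turn a linearly independent list of vectors in R^n into an
    orthonormal set spanning the same subspace: subtract from each
    vector its projections onto the earlier ones, then normalize."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            proj = sum(x * y for x, y in zip(w, b))   # (w, b)
            w = [x - proj * y for x, y in zip(w, b)]  # remove that component
        length = math.sqrt(sum(x * x for x in w))
        basis.append([x / length for x in w])
    return basis

b = gram_schmidt([(1.0, 1.0, 0.0), (1.0, 0.0, 1.0)])
# each b[i] has length 1, and (b[0], b[1]) = 0
```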

  6. UNIT III
  ALGEBRA: An associative ring A is called an algebra over F if A is a vector space over F such that for all a, b ∈ A and α ∈ F, α(ab) = (αa)b = a(αb).
  LINEAR TRANSFORMATION: A linear transformation on V, over F, is an element of A_F(V).
  INVERTIBLE: An element T in A(V) is invertible or regular if it is both right- and left-invertible; that is, if there is an element S ∈ A(V) such that ST = TS = 1. We write S as T^(-1). Here 1 denotes the unit element of A(V).
  SINGULAR: An element in A(V) which is not regular is called singular.
  THEOREM: If V is finite-dimensional over F, then T ∈ A(V) is singular if and only if there exists a v ≠ 0 in V such that vT = 0.
  RANGE: If T ∈ A(V), then the range of T, VT, is defined by VT = {vT | v ∈ V}.

  7. RANK: If V is finite-dimensional over F, then the rank of T is the dimension of VT, the range of T, over F. We denote the rank of T by r(T).
  LEMMA: If V is finite-dimensional over F then for S, T ∈ A(V): 1. r(ST) ≤ r(T); 2. r(TS) ≤ r(T); (and so, r(ST) ≤ min{r(T), r(S)}) 3. r(ST) = r(TS) = r(T) for S regular in A(V).
  CHARACTERISTIC ROOT: If T ∈ A(V) then λ ∈ F is called a characteristic root (or eigenvalue) of T if λ − T is singular.
  THEOREM: The element λ ∈ F is a characteristic root of T ∈ A(V) if and only if for some v ≠ 0 in V, vT = λv.
  CHARACTERISTIC VECTOR: The element v ≠ 0 in V is called a characteristic vector of T belonging to the characteristic root λ ∈ F if vT = λv.
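As an illustration (not from the slides), the condition vT = λv can be checked directly for a small matrix. Following the transcript's convention, the operator acts on the right, so vT is a row vector times a matrix; the matrix and vector below are arbitrary choices.

```python
def apply(v, T):
    """vT: the row vector v multiplied by the matrix T on the right
    (operators written on the right, as in the transcript)."""
    n = len(v)
    return tuple(sum(v[i] * T[i][j] for i in range(n)) for j in range(n))

T = [[2, 1],
     [1, 2]]
v = (1, 1)    # candidate characteristic vector
lam = 3       # candidate characteristic root

# vT = (3, 3) = 3 * (1, 1), so λ = 3 is a characteristic root of T
# with characteristic vector v:
print(apply(v, T) == tuple(lam * x for x in v))  # True
```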

  8. UNIT IV
  SIMILAR: The linear transformations S, T ∈ A(V) are said to be similar if there exists an invertible element C ∈ A(V) such that T = CSC^(-1).
  INVARIANT: The subspace W of V is invariant under T ∈ A(V) if WT ⊆ W.
  THEOREM: If V is n-dimensional over F and if T ∈ A(V) has all its characteristic roots in F, then T satisfies a polynomial of degree n over F.
  INDEX OF NILPOTENCE: If T ∈ A(V) is nilpotent, then k is called the index of nilpotence of T if T^k = 0 but T^(k-1) ≠ 0.
  LEMMA: If u ∈ V1 is such that uT^(n1-k) = 0, where 0 < k ≤ n1, then u = u0 T^k for some u0 ∈ V1.
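The index of nilpotence can be computed by taking successive powers until the zero matrix appears. The sketch below (an illustration, not part of the slides) uses the 3x3 "shift" matrix, a standard nilpotent example.

```python
def matmul(A, B):
    """Product of two n x n matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def nilpotence_index(T):
    """Smallest k with T^k = 0 (assumes T is nilpotent, so the loop
    is guaranteed to terminate by dimension n at the latest)."""
    power = T
    k = 1
    while any(any(x != 0 for x in row) for row in power):
        power = matmul(power, T)
        k += 1
    return k

# The shift matrix satisfies T^2 != 0 but T^3 = 0, so its index is 3:
shift = [[0, 1, 0],
         [0, 0, 1],
         [0, 0, 0]]
print(nilpotence_index(shift))  # 3
```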

  9. CYCLIC: If T ∈ A(V) is nilpotent, the subspace M of V, of dimension m, which is invariant under T, is called cyclic with respect to T if 1. MT^m = (0), MT^(m-1) ≠ (0); 2. there is an element z ∈ M such that z, zT, ..., zT^(m-1) form a basis of M.
  THEOREM: Two nilpotent linear transformations are similar if and only if they have the same invariants.

  10. UNIT V
  TRACE: The trace of A is the sum of the elements on the main diagonal of A. We shall write the trace of A as tr A; if A = (α_ij), then tr A = Σ_{i=1}^{n} α_ii.
  LEMMA: For A, B ∈ F_n and λ ∈ F: 1. tr(λA) = λ tr A; 2. tr(A + B) = tr A + tr B; 3. tr(AB) = tr(BA).
  LEMMA: If F is of characteristic 0 and if S and T, in A_F(V), are such that ST − TS commutes with S, then ST − TS is nilpotent.
  TRANSPOSE: If A = (α_ij) ∈ F_n then the transpose of A, written as A', is the matrix A' = (γ_ij) where γ_ij = α_ji for each i and j.
  LEMMA: For all A, B ∈ F_n: 1. (A')' = A; 2. (A + B)' = A' + B'; 3. (AB)' = B'A'.
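The identity tr(AB) = tr(BA) can be verified numerically (an illustration with arbitrarily chosen matrices, not from the slides); note that it holds even when AB ≠ BA.

```python
def tr(A):
    """Trace: sum of the main-diagonal entries."""
    return sum(A[i][i] for i in range(len(A)))

def matmul(A, B):
    """Product of two n x n matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
print(tr(matmul(A, B)) == tr(matmul(B, A)))  # True
print(matmul(A, B) == matmul(B, A))          # False: AB and BA differ
```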

  11. SYMMETRIC: The matrix A is said to be a symmetric matrix if A' = A.
  SKEW-SYMMETRIC: The matrix A is said to be a skew-symmetric matrix if A' = -A.
  DETERMINANT: If A = (α_ij) then the determinant of A, written det A, is the element Σ_{σ ∈ S_n} (-1)^σ α_{1σ(1)} α_{2σ(2)} ... α_{nσ(n)} in F.
  LEMMA: If two rows of A are equal (that is, v_r = v_s for r ≠ s), then det A = 0.
  THEOREM: For A, B ∈ F_n, det(AB) = (det A)(det B).
  CRAMER'S RULE: If the determinant, Δ, of the system of linear equations α_11 x_1 + ... + α_1n x_n = β_1, ..., α_n1 x_1 + ... + α_nn x_n = β_n is different from 0, then the solution of the system is given by x_i = Δ_i / Δ, where Δ_i is the determinant obtained from Δ by replacing the i-th column by β_1, β_2, ..., β_n.
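The permutation expansion of the determinant and Cramer's rule can both be sketched directly (an illustration over the rationals; the 2x2 system below is an arbitrary example, and the permutation-sum determinant is exponential in n, so this is for small matrices only):

```python
from itertools import permutations
from fractions import Fraction

def det(A):
    """Determinant via the permutation expansion
    det A = sum over sigma in S_n of (-1)^sigma * a_{1,sigma(1)} ... a_{n,sigma(n)}."""
    n = len(A)
    total = Fraction(0)
    for sigma in permutations(range(n)):
        # (-1)^sigma: parity given by the number of inversions
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if sigma[i] > sigma[j])
        term = Fraction(1)
        for i in range(n):
            term *= A[i][sigma[i]]
        total += (-1) ** inversions * term
    return total

def cramer(A, b):
    """Solve Ax = b by Cramer's rule: x_i = det(A_i) / det(A), where
    A_i is A with its i-th column replaced by b."""
    delta = det(A)
    n = len(A)
    return [det([[b[r] if c == i else A[r][c] for c in range(n)]
                 for r in range(n)]) / delta
            for i in range(n)]

# 2x1 + x2 = 5, x1 + 3x2 = 10  has the solution x1 = 1, x2 = 3:
A = [[2, 1], [1, 3]]
b = [5, 10]
print(cramer(A, b))  # [Fraction(1, 1), Fraction(3, 1)]
```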
