Understanding Basis and Dimension in Linear Algebra
Basis and dimension are fundamental concepts in linear algebra. A basis is a set of vectors that can represent any vector in a given space through linear combinations. The dimension of a vector space is the number of vectors in any basis of that space. Linear independence, spanning, finite-dimensional spaces, and the implications of linearly dependent vectors are all crucial aspects of understanding basis and dimension.
BASIS AND DIMENSION Chapter 2
BASIS In linear algebra, a basis is a set of vectors in a given vector space with two properties: one can get any vector in the vector space by multiplying each of the basis vectors by a scalar and then adding the results, and if any vector is removed from the basis, this property is no longer satisfied. The dimension of a given vector space is the number of elements of the basis.
Basis: A set of n vectors {v1, v2, ..., vn} is a basis of a space S if these two conditions hold: {v1, v2, ..., vn} are linearly independent, and {v1, v2, ..., vn} span the set S; in other words, Span{v1, v2, ..., vn} = S. For example, take the vectors (0,1) and (1,0). These vectors are linearly independent, since there is no way to scale (0,1) into (1,0). Additionally, these two vectors span the entire 2-D plane, because any point (x, y) in 2-D space can be rewritten as a linear combination of (0,1) and (1,0), namely (x, y) = y(0,1) + x(1,0).
(0,1) and (1,0) therefore form a basis of R^2 (this specific basis is called the standard basis). However, it is not the only possible basis of R^2. Take the vectors (1,1) and (-1,1). These two vectors are also linearly independent, since you cannot scale one vector into the other. They also span all of R^2, because any point (x, y) can be written as a(1,1) + b(-1,1) with a = (x + y)/2 and b = (y - x)/2. Therefore, (1,1) and (-1,1) form another basis for R^2.
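As a quick numerical check of this example, here is a minimal sketch using NumPy (an addition of this note, not part of the original slides): a square matrix whose columns are the candidate basis vectors is invertible exactly when those vectors form a basis of R^2.

```python
import numpy as np

# Columns are the candidate basis vectors (1, 1) and (-1, 1).
B = np.array([[1.0, -1.0],
              [1.0,  1.0]])

# A square matrix with linearly independent columns is invertible, so a
# nonzero determinant confirms both independence and spanning of R^2.
print(np.linalg.det(B))      # 2.0 (nonzero), so {(1,1), (-1,1)} is a basis

# Express an arbitrary point, e.g. (3, 5), in this basis by solving B @ c = v.
v = np.array([3.0, 5.0])
c = np.linalg.solve(B, v)
print(c)                     # [4. 1.], i.e. (3, 5) = 4*(1,1) + 1*(-1,1)
```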
FINITE DIMENSIONAL A vector space V is called finite dimensional if some finite list of vectors in V spans the space. Any space which is not finite dimensional is called an infinite dimensional space. Linearly Independent Let v1, ..., vn be a list of vectors. This list of vectors is said to be linearly independent if a1v1 + ... + anvn = 0 for a1, ..., an in F only when a1 = ... = an = 0. The empty list is also declared to be linearly independent. If vectors are removed from a linearly independent list, the remaining vectors in the list are still linearly independent. A list of vectors that is not linearly independent is known as linearly dependent, and any list containing a linearly dependent sublist is also linearly dependent. The length of a list of linearly independent vectors is at most the length of any list of vectors that spans the vector space.
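This independence test can also be checked numerically: a list of vectors is linearly independent exactly when the matrix having those vectors as columns has rank equal to the number of vectors. A minimal sketch using NumPy, with example vectors chosen here purely for illustration:

```python
import numpy as np

def is_linearly_independent(vectors):
    # Stack the vectors as columns; they are linearly independent exactly
    # when the rank of the resulting matrix equals the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_linearly_independent([np.array([1.0, 0.0, 2.0]),
                               np.array([0.0, 1.0, 1.0])]))   # True
print(is_linearly_independent([np.array([1.0, 2.0, 3.0]),
                               np.array([2.0, 4.0, 6.0])]))   # False: second vector is 2x the first
```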
In a list of linearly dependent vectors, it is possible to express at least one vector as a linear combination of the other vectors in that list. The converse also holds, which can be used to establish whether a list of vectors is linearly independent or dependent. If we express a vector as a linear combination of a list of linearly independent vectors, the coefficients used in the process are unique. To see this, assume that two representations of the same vector exist, say v = a1v1 + ... + anvn and v = b1v1 + ... + bnvn. Subtracting the two gives (a1 - b1)v1 + ... + (an - bn)vn = 0, and linear independence forces a1 = b1, ..., an = bn, so the two representations are the same.
DIMENSION The dimension of a finite dimensional vector space is the length of any basis of the space. It is denoted by dim V. For instance, dim F^n = n, since the standard basis has size n. It also follows that the dimension of a subspace of a vector space V is at most the dimension of V. We know that any line passing through the origin is a subspace of R^2, and the dimension of the line is 1, since the vectors on the line are just scalar multiples of a single vector.
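In computational terms, the dimension of the span of a list of vectors equals the rank of the matrix whose columns are those vectors. A small sketch of the line example, again using NumPy and an arbitrarily chosen direction vector (1, 2):

```python
import numpy as np

# A line through the origin in R^2 spanned by (1, 2); the other columns are
# scalar multiples of it, so they add nothing to the span.
line_vectors = np.column_stack([[1.0, 2.0],
                                [2.0, 4.0],
                                [-3.0, -6.0]])

# The rank of the matrix is the dimension of the span.
print(np.linalg.matrix_rank(line_vectors))   # 1: the line is a 1-dimensional subspace of R^2
```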
COORDINATE OF A VECTOR RELATIVE TO THE BASIS In linear algebra, a coordinate vector is a representation of a vector as an ordered list of numbers that describes the vector in terms of a particular ordered basis. Coordinates are always specified relative to an ordered basis.
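In practice, the coordinate vector can be found by solving the linear system whose coefficient matrix has the ordered basis vectors as columns. A minimal sketch, assuming NumPy and a basis and vector chosen here only as an example:

```python
import numpy as np

def coordinates(basis, v):
    # Solve B @ c = v, where B has the ordered basis vectors as columns;
    # c is the unique coordinate vector of v relative to that basis.
    B = np.column_stack(basis)
    return np.linalg.solve(B, v)

# Coordinates of (4, 7) relative to the ordered basis ((2, 1), (1, 3)).
print(coordinates([np.array([2.0, 1.0]), np.array([1.0, 3.0])],
                  np.array([4.0, 7.0])))   # [1. 2.], i.e. (4, 7) = 1*(2,1) + 2*(1,3)
```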
INVARIANCE OF NUMBER OF ELEMENTS OF A BASIS A ring R has invariant basis number (IBN) if for all positive integers m and n, R^m isomorphic to R^n (as left R-modules) implies that m = n. Equivalently, this means there do not exist distinct positive integers m and n such that R^m is isomorphic to R^n. Rephrasing the definition of invariant basis number in terms of matrices, it says that, whenever A is an m-by-n matrix over R and B is an n-by-m matrix over R such that AB = I and BA = I, then m = n. This form reveals that the definition is left-right symmetric, so it makes no difference whether we define IBN in terms of left or right modules; the two definitions are equivalent. Note that the isomorphisms in the definitions are not ring isomorphisms, they are module isomorphisms.
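Every field satisfies this matrix condition, which can be illustrated numerically over the real numbers: a 2-by-3 matrix A can have a right inverse B with AB = I_2, but BA is then a 3-by-3 matrix of rank at most 2, so BA = I_3 is impossible unless m = n. A hedged sketch of that check, using NumPy and an arbitrary example matrix:

```python
import numpy as np

# A 2-by-3 real matrix with full row rank.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])

# For a full-row-rank matrix, the Moore-Penrose pseudoinverse is a right inverse.
B = np.linalg.pinv(A)
print(np.allclose(A @ B, np.eye(2)))   # True: AB = I_2

# BA is 3-by-3 but has rank at most 2, so it cannot be the identity I_3.
print(np.linalg.matrix_rank(B @ A))    # 2
print(np.allclose(B @ A, np.eye(3)))   # False: requiring both AB = I and BA = I forces m = n
```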
ECHELON MATRIX
DIMENSION OF SUBSPACE Let W be a subspace of V. An immediate consequence of the above is that dim(W) ≤ dim(V). To see this, let w1, ..., wm be a basis for W, where m = dim(W). As W is a subspace of V, {w1, ..., wm} is a linearly independent set in V and its span, which is simply W, is contained in V. Extend this set to {w1, ..., wm, u1, ..., uk} so that it gives a basis for V. Then m + k = dim(V). As k ≥ 0, we get m ≤ dim(V), with strict inequality if and only if W is a proper subspace of V.
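The extension step can be mimicked numerically by appending candidate vectors (for example, the standard basis) one at a time and keeping only those that raise the rank. A minimal sketch in R^3, assuming NumPy, with an arbitrary plane chosen as the subspace W:

```python
import numpy as np

def extend_to_basis(independent_list, candidates):
    # Append each candidate only if it increases the rank, i.e. if it lies
    # outside the span of the vectors collected so far.
    basis = list(independent_list)
    for c in candidates:
        trial = np.column_stack(basis + [c])
        if np.linalg.matrix_rank(trial) > len(basis):
            basis.append(c)
    return basis

# W is a plane in R^3 spanned by (1, 0, 1) and (0, 1, 1), so dim(W) = 2 <= dim(R^3) = 3.
W_basis = [np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
standard_basis = [np.eye(3)[:, i] for i in range(3)]
full_basis = extend_to_basis(W_basis, standard_basis)
print(len(full_basis))   # 3: one extra vector completes a basis of R^3
```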