Edwards-Penney Chapter 4, sections 4.5, 4.6, 4.7
Topics, Definitions, Theorems

4.5
====
DEF. Column space = span of the columns of matrix A. Row space = span of the rows of matrix A.

DEF. Row rank and column rank are the dimensions of the corresponding spaces.

THEOREM 1. The nonzero rows of an echelon matrix (reduced or not) are linearly independent and form a basis for the row space.

THEOREM 2. Two matrices related by a combo-swap-mult sequence have the same row space.

ALGORITHM 1. To find a basis for the row space of matrix A, choose the nonzero rows of any echelon matrix found by combo-swap-mult operations.

THEOREM. A row space of row rank 1 or higher does not have a unique basis: it has infinitely many bases. The same is true for a column space of column rank 1 or higher.

EXAMPLE. The nonzero rows of a 4x5 echelon matrix are independent.

EXAMPLE. Find a row space basis for a 4x5 matrix A.

DEF. A pivot column of a matrix B is a column of B whose matching column in RREF(B) contains a leading one. The pivot column locations can be found from any echelon matrix obtained from B by combo-swap-mult operations.

PIVOT THEOREM. [EXAMPLE 3]
1. The pivot columns of a matrix A are independent and form a basis for the column space of A.
2. Any non-pivot column of A is a linear combination of the pivot columns of A.

ALGORITHM 2. To find a basis for the span of the columns of A (the column space of A), develop a combo-swap-mult sequence to find an echelon form C. The columns of A which correspond to the pivot columns in matrix C are a basis for the column space of A.

THEOREM. The algorithm can be written as span(cols of A) = span(pivot cols of A).

WARNING: The pivot columns of C are not in general a basis for the column space of A, because combo-swap-mult operations generally change the column space; a swap, for instance, changes the row location of a leading one. Can you find an example?

EXAMPLE. Find the pivot columns of a 4x5 matrix A and report a basis for the column space.

EXAMPLE. Let W=span(S) where S is a set of 4 vectors in R^4. Find a largest subset S1 of S such that span(S) = span(S1).
SOLUTION: S1 = the pivot columns of A, where the columns of A are the vectors in set S.

THEOREM 3. The row rank and the column rank of a matrix are equal. Literature: rank(A) = rank(A^T). (A sketch of Algorithms 1 and 2 and this rank equality appears at the end of this section.)

THEOREM [Superposition]. The equation Ax=b has a solution x=x0 if and only if b is in the column space of A. The general solution of Ax=b is of the form x=x0+xh, where x0 is any particular solution of Ax=b and xh is the general solution of the homogeneous problem Ax=0.

DEF. A subset of rows in matrix A is called IRREDUNDANT provided the rows are independent. This is equivalent to the corresponding columns of A^T being among the pivot columns of A^T. The notion is used to find redundant equations, that is, the rows of A, and hence the equations in Ax=0, which can be ignored.
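The two algorithms and the rank equality can be checked on a small example. Below is a minimal sketch in Python with the sympy library (sympy and the 4x5 matrix are illustrative assumptions, not part of the text): rref() returns the reduced echelon form together with the pivot column indices, giving a row space basis (ALGORITHM 1) and a column space basis (ALGORITHM 2) at once.

    # Sketch: ALGORITHMS 1 and 2 plus THEOREM 3, using sympy.
    # The 4x5 matrix A is an invented example.
    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1, 3],
                [2, 4, 1, 3, 7],
                [0, 0, 1, 1, 1],
                [1, 2, 1, 2, 4]])

    R, pivots = A.rref()    # reduced echelon form, pivot column indices

    # ALGORITHM 1: the nonzero rows of the echelon form (the first
    # rank(A) rows of RREF) are a basis for the row space.
    row_basis = [R.row(i) for i in range(A.rank())]

    # ALGORITHM 2 (PIVOT THEOREM): the columns of A itself, taken at
    # the pivot locations, are a basis for the column space.  Per the
    # WARNING, use columns of A, not columns of R.
    col_basis = [A.col(j) for j in pivots]

    # THEOREM 3: row rank = column rank.
    assert A.rank() == A.T.rank() == len(pivots)

    print(pivots)           # (0, 2) -- sympy's 0-based pivot column indices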
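The Superposition theorem admits a similar check: manufacture b inside the column space, then verify that a particular solution plus any kernel vector still solves Ax=b. A minimal sketch, reusing the invented matrix above (repeated so the block runs on its own):

    # Sketch: THEOREM [Superposition] with the same invented matrix A.
    from sympy import Matrix, symbols

    A = Matrix([[1, 2, 0, 1, 3],
                [2, 4, 1, 3, 7],
                [0, 0, 1, 1, 1],
                [1, 2, 1, 2, 4]])

    x0 = Matrix([1, 0, 1, 0, 0])    # arbitrary particular vector
    b = A * x0                      # so b lies in the column space of A

    # xh = general solution of Ax=0: any combination of the kernel
    # basis vectors (rank 2, so the nullity is 5 - 2 = 3).
    c1, c2, c3 = symbols('c1 c2 c3')
    k = A.nullspace()
    xh = c1*k[0] + c2*k[1] + c3*k[2]

    # x = x0 + xh solves Ax=b for every choice of c1, c2, c3.
    assert (A * (x0 + xh)).expand() == b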
4.6
====
DEF. Dot product of two vectors.

THEOREM. [Properties of the dot product]
u.v = v.u
u.(v+w) = u.v + u.w
(cu).v = c(u.v) for constants c
u.u >= 0
u.u = 0 if and only if u = 0

DEF. An inner product on a vector space V is a function (u,v) ==> R satisfying the following properties:
1. (u,v) = (v,u)
2. (u,v+w) = (u,v) + (u,w)
3. (cu,v) = c(u,v) for constants c
4. (u,u) >= 0, and (u,u) = 0 if and only if u = 0

THEOREM. u.v = u^T v (a matrix product on the right).

DEF. |u| = sqrt(u.u)

THEOREM 1. [Cauchy-Schwarz inequality] |u.v| <= |u| |v|

DEF. Angle theta between two nonzero vectors: cos(theta) = u.v/(|u| |v|)

DEF. u and v are orthogonal means u.v = 0. Either vector can be zero.

THEOREM 2. [Triangle inequality] |u+v| <= |u| + |v|

THEOREM 3. A set of nonzero pairwise orthogonal vectors is linearly independent.

Results on orthogonal complements and the Fredholm Alternative are not studied in 2250, but in the math course 2270. So little is done in section 4.6 that it is perhaps best to skip it and refer to Strang's book, the textbook for 2270. All results in 4.6 distill to the single fact "the row space of A is perpendicular to the kernel of A," which appears in THEOREM 5. From this result it is possible to obtain (easily) the results on the dimensions of Strang's Four Subspaces appearing on the cover of his textbook, and standard bases for these four subspaces.

4.7
====
EXAMPLE 1. Vector space of all mxn matrices. A realization is the set of all digital photos of a fixed resolution. How to find a basis.

EXAMPLE 2. Vector space of all 2x2 matrices of the form Matrix([[a,-b],[b,a]]), which is another way to present the complex numbers a+ib. Matrix multiply corresponds to multiplying complex numbers.

EXAMPLE 3. Vector space of all real-valued functions on the line.
Subspace W1 = span(exp(x), exp(2x))
Subspace W2 = span(all polynomials)
Subspace W3 = span(1, cos(x)^2, sin(x)^2)

EXAMPLE 4. Vector space of all real-valued functions on the line.
Subspace W[n] = span(all polynomials of degree n or less)
Sampling test for independence applied to 1, x, x^2. Vandermonde matrix. (A sketch of the sampling test appears after EXAMPLE 9 below.)

EXAMPLE 5. Partial fractions application.

EXAMPLE 6. Vector space of all real-valued functions on the line.
Subspace W = span(1, x, 3x^2-1, 5x^3-3x)
Show that the listed polynomials are independent. Conclude that W = W[3] = span(all polynomials of degree <= 3).

DEF. The solution space W of a constant-coefficient linear homogeneous differential equation, like y''-2y'=0 or y''=0, is the set of all solutions of the given differential equation.

THEOREM. The solution space W of a constant-coefficient linear homogeneous differential equation is a subspace of the vector space V of all real-valued functions on the line.

EXAMPLE 7. Vector space of all real-valued functions on the line.
W = solution space of the differential equation y''=0.
Show that W = span(1,x) = W[1] = span(all polynomials of degree <= 1).
METHOD: Section 1.2.

EXAMPLE 8. Vector space of all real-valued functions on the line.
W = solution space of the differential equation y''-2y'=0.
Show that W = span(1, exp(2x)) and that the given functions are independent, hence form a basis for W.

EXAMPLE 9. Vector space of all real-valued functions on the line.
Subspace W is the solution space of y''-y=0.
Show that W = span(cosh(x), sinh(x)), where
cosh(x) = (1/2)(exp(x)+exp(-x))
sinh(x) = (1/2)(exp(x)-exp(-x))
Show that W = span(exp(x), exp(-x)).
Show that the listed functions, in each case, are independent and form a basis for the solution space W.
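A minimal sketch of the sampling test from EXAMPLE 4, again in Python with sympy (the sample points are an arbitrary choice, not fixed by the text): evaluate the candidate functions at enough distinct points to get a square matrix; a nonzero determinant proves independence. For 1, x, x^2 that matrix is the Vandermonde matrix; the same test covers the four polynomials of EXAMPLE 6.

    # Sketch: sampling test for independence (EXAMPLES 4 and 6).
    from sympy import Matrix, symbols, S

    x = symbols('x')

    # EXAMPLE 4: sample 1, x, x^2 at -1, 0, 1 -- a Vandermonde matrix.
    V = Matrix([[1, t, t**2] for t in (-1, 0, 1)])
    assert V.det() != 0     # nonzero det => 1, x, x^2 are independent

    # EXAMPLE 6: sample the four polynomials at -1, 0, 1, 2.
    polys = [S(1), x, 3*x**2 - 1, 5*x**3 - 3*x]
    M = Matrix([[p.subs(x, t) for p in polys] for t in (-1, 0, 1, 2)])
    assert M.det() != 0     # the four polynomials are independent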
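The solution space claims of EXAMPLES 7-9 can be verified symbolically as well. A minimal sketch, assuming sympy as before: substitute each candidate function into its differential equation, then check the identities tying cosh and sinh to exp(x) and exp(-x), which show the two proposed bases in EXAMPLE 9 span the same space W.

    # Sketch: verifying EXAMPLES 7-9 by direct substitution.
    from sympy import symbols, exp, cosh, sinh, diff, simplify, S

    x = symbols('x')

    # EXAMPLE 7: 1 and x solve y'' = 0.
    for y in (S(1), x):
        assert diff(y, x, 2) == 0

    # EXAMPLE 8: 1 and exp(2x) solve y'' - 2y' = 0.
    for y in (S(1), exp(2*x)):
        assert simplify(diff(y, x, 2) - 2*diff(y, x)) == 0

    # EXAMPLE 9: cosh, sinh, exp(x), exp(-x) all solve y'' - y = 0.
    for y in (cosh(x), sinh(x), exp(x), exp(-x)):
        assert simplify(diff(y, x, 2) - y) == 0

    # cosh and sinh are combinations of exp(x), exp(-x) and vice versa,
    # so span(cosh, sinh) = span(exp(x), exp(-x)) = W.
    assert (cosh(x) - (exp(x) + exp(-x))/2).rewrite(exp) == 0
    assert (sinh(x) - (exp(x) - exp(-x))/2).rewrite(exp) == 0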