2250 Lecture Record Week 6 F2009

Last Modified: October 07, 2009, 16:56 MDT.

Week 6, Sep 28 to Oct 2: Sections 3.5, 3.6, 4.1

28 Sep: Elementary matrices. Section 3.5

Lecture: Elementary matrices.
How to write a frame sequence as a product of elementary matrices.
Fundamental theorem on frame sequences
THEOREM. If A2 is the frame obtained from the previous frame A1 by one
toolkit operation, then A2=E A1, where E is the elementary matrix built
from the identity matrix I by applying that same toolkit operation
combo(s,t,c), swap(s,t) or mult(t,m).

THEOREM. If a frame sequence starts with A and ends with B, then
              B=(product of elementary matrices)A.

  The meaning: If A is the first frame and B a later frame in a sequence,
then there are elementary swap, combo and mult matrices E1 to En such
that the frame sequence A ==> B can be written as the matrix multiply
equation
                     B=En En-1 ... E1 A.
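A minimal Maple illustration of the theorem, assuming the linalg package and a small 2x2 matrix chosen only for illustration (not an exercise from the text):
> with(linalg): # elementary matrices built from the identity
> A:=matrix([[1,2],[3,4]]); # first frame A
> E1:=addrow(diag(1,1),1,2,-3); # combo(1,2,-3) applied to I
> E2:=mulrow(diag(1,1),2,-1/2); # mult(2,-1/2) applied to I
> E3:=addrow(diag(1,1),2,1,-2); # combo(2,1,-2) applied to I
> B:=evalm(E3 &* E2 &* E1 &* A); # last frame B = E3 E2 E1 A
> rref(A); # the same matrix, computed directly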
Web References: Elementary matrices
Slides: vector models and vector spaces (110.3 K, pdf, 03 Oct 2009)
Slides: Elementary matrix theorems (114.4 K, pdf, 03 Oct 2009)
Slides: Elementary matrices, vector spaces (35.8 K, pdf, 18 Feb 2007)

29 Sep: Inverses. Rank and nullity. Section 3.5.

Discussion of 3.4 problems.
Due today, maple lab L2.1.
Lecture: How to compute the inverse matrix from inverse = adjugate/determinant (2x2 case) and also by frame sequences. Inverse rules.
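A quick Maple check of the 2x2 formula inverse = adjugate/determinant, done symbolically (the entries a, b, c, d are assumed symbols, not from a textbook problem):
> with(linalg): # 2x2 adjugate inverse check
> A:=matrix([[a,b],[c,d]]); # symbolic 2x2 matrix
> adjoint(A); # adjugate: swap the diagonal, negate the off-diagonal
> evalm((1/det(A))*adjoint(A)); # adjugate divided by determinant, valid when a*d-b*c <> 0
> inverse(A); # Maple's answer, for comparison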
Web Reference: Construction of inverses. Theorems on inverses.
Slides: Inverse matrix, frame sequence method (71.6 K, pdf, 02 Oct 2009)
Slides: Matrix add, scalar multiply and matrix multiply (122.5 K, pdf, 02 Oct 2009)
Elementary matrices. Inverses of elementary matrices.
Solving B=E3 E2 E1 A for matrix A = (E3 E2 E1)^(-1) B.
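A hedged Maple sketch of these two facts, using assumed 3x3 examples (randmatrix supplies an arbitrary starting frame):
> with(linalg): # inverses of elementary matrices
> E1:=addrow(diag(1,1,1),1,2,5); # combo(1,2,5) applied to I
> E2:=swaprow(diag(1,1,1),1,3); # swap(1,3) applied to I
> E3:=mulrow(diag(1,1,1),2,7); # mult(2,7) applied to I
> inverse(E1), inverse(E2), inverse(E3); # combo(1,2,-5), swap(1,3), mult(2,1/7)
> A:=randmatrix(3,3); # an arbitrary first frame
> B:=evalm(E3 &* E2 &* E1 &* A); # the frame sequence ends at B
> evalm(inverse(evalm(E3 &* E2 &* E1)) &* B); # recovers A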
About problem 3.5-44: This problem is the basis for the fundamental theorem on elementary matrices (see below). While 3.5-44 is a difficult technical proof, the extra credit problems on this subject replace the proofs by a calculation. See Xc3.5-44a and Xc3.5-44b.

How to do 3.5-16 in maple. Maple answer checks.
> with(linalg): # 3.5-16: load the linear algebra package
> A:=matrix([[1,-3,-3],[-1,1,2],[2,-3,-3]]); # coefficient matrix
> A1:=augment(A,diag(1,1,1)); # augment A with the 3x3 identity
> rref(A1); # reduced echelon form; the right half is inverse(A)
> B:=inverse(A); # direct inverse, used for the answer check
> A2:=addrow(A1,1,2,1); # frame sequence step combo(1,2,1): add row 1 to row 2
> A3:=addrow(A2,1,3,-2); # combo(1,3,-2): add -2 times row 1 to row 3
> evalm(A&*B); # answer check: the product should be the identity matrix

Lecture: Ideas of rank, nullity, dimension in examples.
More on Rank, Nullity, dimension, 3 possibilities, elimination algorithm.
Slides: Rank, nullity and elimination (111.6 K, pdf, 29 Sep 2009)
Answer to the question: What did I just do, when I found rref(A)?
Problems 3.4-17 to 3.4-22 are homogeneous systems Ax=0 with A in reduced echelon form. Apply the last frame algorithm then write the general solution in vector form.
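A short Maple sketch of rank, nullity and the vector form of the general solution, for an assumed matrix already in reduced echelon form (in the spirit of 3.4-17 to 3.4-22):
> with(linalg): # rank, nullity and the general solution of Ax=0
> A:=matrix([[1,2,0,3],[0,0,1,4],[0,0,0,0]]); # reduced echelon form
> rank(A); # number of lead variables: 2
> coldim(A)-rank(A); # nullity = number of free variables: 2
> kernel(A); # basis of solutions; the general solution is a linear combination of these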

29 Sep: Determinants. Section 3.6.

Due today, 3.4-20,30,34,40. See problem notes chapter 3
html: Problem notes F2009 (4.0 K, html, 22 Sep 2009)
Lecture: Introduction to 3.6 determinant theory and Cramer's rule.
Sarrus' rule for 2x2 and 3x3. General Sarrus' rule with n-factorial arrows.
Lecture: Adjugate formula for the inverse. Review of Sarrus' Rules.
Slides for 3.6 determinant theory:
    Slides: Determinants 2008 (167.7 K, pdf, 03 Oct 2009)
    Manuscript: Determinants, Cramer's rule, Cayley-Hamilton (186.5 K, pdf, 09 Aug 2009)
Lecture: Methods for computing a determinant.
  1. Sarrus' rule, 2x2 and 3x3 cases.
  2. Four rules for determinants
    1. Triangular Rule
    2. Multiply rule
    3. Swap rule
    4. Combo rule
  3. Cofactor expansion. Details for the 3x3 case.
  4. Hybrid methods.
THEOREM. The 4 rules for computing any determinant can be compressed into two rules,
  1. The triangular rule, and
  2. det(EA)=det(E)det(A)
where E is an elementary combo, swap or mult matrix.
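A short Maple check of the theorem on an assumed triangular 3x3 matrix; each displayed equation should have equal left and right sides:
> with(linalg): # det(EA) = det(E)det(A) for each elementary matrix type
> A:=matrix([[2,1,0],[0,3,5],[0,0,7]]); # triangular, so det(A) = 2*3*7 = 42
> det(A); # triangular rule
> E:=swaprow(diag(1,1,1),1,2); # swap(1,2), det(E) = -1
> det(evalm(E &* A)) = det(E)*det(A); # both sides equal -42
> E:=mulrow(diag(1,1,1),3,10); # mult(3,10), det(E) = 10
> det(evalm(E &* A)) = det(E)*det(A); # both sides equal 420
> E:=addrow(diag(1,1,1),1,3,-4); # combo(1,3,-4), det(E) = 1
> det(evalm(E &* A)) = det(E)*det(A); # both sides equal 42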

30 Sep: Determinants, Cramer's Rule, Adjugate formula. Section 3.6

Drill: Triangular rule [one-arrow Sarrus' rule], combo, swap and mult rules. Cofactor rule.
Review: College algebra determinant definition and Sarrus' rule for 2x2 and 3x3 matrices.
Examples: Computing det(A) easily. When does det(A)=0?
THEOREM. Determinant values for elementary matrices:
    det(E)=1 for combo(s,t,c),
    det(E)=m for mult(t,m),
    det(E)=-1 for swap(s,t).
    Review of Main theorems:
  1. Computation by the 4 rules, cofactor expansion, hybrid methods.
  2. Determinant product theorem det(AB)=det(A)det(B).
  3. Cramer's Rule for solving Ax=b:
    x1 = delta1/delta, ... , xn = deltan/delta
  4. Adjugate formula: A adj(A) = adj(A) A = det(A) I
  5. Adjugate inverse formula inverse(A) = adjugate(A)/det(A).
    Lecture:
  1. Cofactor expansion of det(A).
      How to form minors, checkerboard signs and cofactors.
  2. Hybrid methods to evaluate det(A).
  3. How to use the 4 rules to compute det(A) for any size matrix.
  4. Computing determinants of sizes 3x3, 4x4, 5x5 and higher.
  5. Frame sequences and determinants.
    Formula for det(A) in terms of swap and mult operations.
  6. Special theorems for determinants having a zero row, duplicate rows or proportional rows.
  7. Elementary matrices and determinants. Determinant product rule for elementary matrices.
  8. Cramer's rule.
    How to form the matrix of cofactors and its transpose, the adjugate matrix (see the Maple sketch after this list).
  9. How to reduce the Four rules [triangular, swap, combo, mult] to Two Rules using the determinant product theorem det(AB)=det(A)det(B).
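A minimal Maple check of Cramer's rule and the adjugate inverse formula, using an assumed 3x3 system (not an exercise from the text):
> with(linalg): # Cramer's rule and the adjugate formula
> A:=matrix([[2,0,1],[1,3,0],[0,1,4]]); # coefficient matrix, det(A) = 25
> b:=vector([3,4,5]); # right side of Ax=b
> dd:=det(A); # delta
> x1:=det(matrix([[3,0,1],[4,3,0],[5,1,4]]))/dd; # delta1/delta: column 1 replaced by b
> x2:=det(matrix([[2,3,1],[1,4,0],[0,5,4]]))/dd; # delta2/delta: column 2 replaced by b
> x3:=det(matrix([[2,0,3],[1,3,4],[0,1,5]]))/dd; # delta3/delta: column 3 replaced by b
> linsolve(A,b); # answer check for Cramer's rule
> evalm(A &* adjoint(A)); # equals det(A) times the identity
> evalm((1/det(A))*adjoint(A)); # adjugate inverse formula; equals inverse(A)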

Slides: Determinants 2008 (167.7 K, pdf, 03 Oct 2009)
Manuscript: Determinants, Cramer's rule, Cayley-Hamilton (186.5 K, pdf, 09 Aug 2009)
html: Problem notes F2009 (4.0 K, html, 22 Sep 2009)

01 Oct: Fusi and Richins

Exam 1 starts at 7:00am, to give extra time for those who need it. If you cannot make this exam time, for any reason, then please call 801-581-6879 and also leave email for me to read.

02 Oct: Introduction to Chapter 4. Vector Space. Section 4.1.

   Exercises 3.4, 3.5 details.


Problems: 3.4-34 and 3.4-40. How to solve them.
  Cayley-Hamilton Theorem. Superposition proof.
  Web notes on these problems.
  Discussion of the Cayley-Hamilton theorem [Exercise 3.4-29; see also Section 6.3]
  Problem 3.4-29 is used in Problem 3.4-30.  How to solve problem 3.4-30.
The Cayley-Hamilton Theorem is a famous result in linear algebra which is the basis for solving systems of differential equations.
Manuscript: Determinants, Cramer's rule, Cayley-Hamilton (186.5 K, pdf, 09 Aug 2009)
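A minimal Maple check of the Cayley-Hamilton theorem on an assumed 2x2 matrix, using the linalg command charpoly:
> with(linalg): # Cayley-Hamilton: A satisfies its own characteristic equation
> A:=matrix([[1,2],[3,4]]); # illustrative 2x2 matrix
> charpoly(A,lambda); # characteristic polynomial lambda^2 - 5*lambda - 2
> evalm(A &* A - 5*A - 2*diag(1,1)); # substitute A for lambda: the zero matrix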
Problem 3.4-40 is the superposition principle for the matrix equation Ax=b.
It is the analog of the differential equation relation y=y_h + y_p.
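A brief Maple illustration of the superposition idea x = x_h + x_p, with an assumed singular coefficient matrix so that Ax=0 has nonzero solutions:
> with(linalg): # superposition: A(xh + xp) = A xh + A xp = 0 + b = b
> A:=matrix([[1,2],[2,4]]); # singular matrix
> b:=vector([3,6]); # right side
> xp:=vector([3,0]); # particular solution: A xp = b
> xh:=vector([-2,1]); # homogeneous solution: A xh = 0
> evalm(A &* (xp + t*xh)); # equals b for every value of t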
  Determinant product theorem
    det(EC)=det(E)det(C)  for elementary matrices E
    det(AB)=det(A)det(B) for any two square matrices A,B
    Proof details.
    Example.
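A quick numerical check of the product theorem, using linalg's randmatrix to supply assumed 4x4 examples:
> with(linalg): # det(AB) = det(A)det(B) for square matrices A, B
> A:=randmatrix(4,4): B:=randmatrix(4,4):
> det(evalm(A &* B)) - det(A)*det(B); # always evaluates to 0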
Textbook: Chapter 4, sections 4.1 and 4.2.
Web references for chapter 4.
Slides: Vector space, subspace, independence (132.5 K, pdf, 03 Oct 2009)
Manuscript: Vector space, Independence, Basis, Dimension, Rank (206.4 K, pdf, 27 Feb 2007)
Slides: The pivot theorem and applications (131.9 K, pdf, 02 Oct 2009)
Slides: Rank, nullity and elimination (111.6 K, pdf, 29 Sep 2009)
Slides: Digital photos, Maxwell's RGB separations, visualization of matrix add (153.7 K, pdf, 16 Oct 2009)
Slides: More on digital photos, checkerboard analogy (109.5 K, pdf, 02 Oct 2009)
Slides: Orthogonality (87.2 K, pdf, 10 Mar 2008)
Transparencies: Ch4 Page 237+ slides, Exercises 4.1 to 4.4, some 4.9 (463.2 K, pdf, 25 Sep 2003)
html: Problem notes F2009 (4.0 K, html, 22 Sep 2009)
Lecture: Abstract vector spaces.
  Def: Vector==package of data items.
  Vectors are not arrows.
  The four vector models
    Fixed vectors
    Triad i,j,k algebraic calculus model
    Physics and Engineering arrows
    Gibbs motion
  The 8-Property Toolkit
  Def: vector space, subspace
    Working set == subspace.
    Data set == Vector space
  Examples of vectors:
     Digital photos,
     Fourier coefficients,
     Taylor coefficients,
     Solutions to DE. Example: y=2exp(-x^2) solves the DE y'=-2xy, y(0)=2 (see the Maple check below).
  RGB color separation and matrix add
  Intensity adjustments and scalar multiply
    Digital photos and matrix add, scalar multiply visualization.
    Slides: Digital photos, Maxwell's RGB separations, visualization of matrix add (153.7 K, pdf, 16 Oct 2009)
    Slides: More on digital photos, checkerboard analogy (109.5 K, pdf, 02 Oct 2009)
Four Vector Models: Fixed vectors, physics vectors i,j,k, engineering vectors (arrows), Gibbs vectors.
Slides: vector models and vector spaces (110.3 K, pdf, 03 Oct 2009)
Parallelogram law. Head minus tail rule.
The 8-property toolkit for vectors. Vector spaces. Reading: Section 4.1 in Edwards-Penney, especially the 8 properties.
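A Maple check of the example cited above, that y = 2 exp(-x^2) solves y' = -2xy, y(0)=2 (dsolve and odetest are standard Maple commands):
> de:=diff(y(x),x) = -2*x*y(x); # the differential equation
> dsolve({de, y(0)=2}, y(x)); # returns y(x) = 2*exp(-x^2)
> odetest(y(x)=2*exp(-x^2), de); # 0 means the formula satisfies the DE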

The 7:30 class lectures fell short of the target, while the 12:25 class remained ahead of schedule. What appears here for this week is accurate only for the 12:25 class.

References for Chapters 3 and 4


    Slides: vector models and vector spaces (110.3 K, pdf, 03 Oct 2009)
    Slides: Vector space, subspace, independence (132.5 K, pdf, 03 Oct 2009)
    Manuscript: Vector space, Independence, Basis, Dimension, Rank (206.4 K, pdf, 27 Feb 2007)
    Manuscript: Linear equations, reduced echelon, three rules (45.8 K, pdf, 22 Sep 2006)
    Manuscript: Three rules, frame sequence, maple syntax (35.8 K, pdf, 25 Jan 2007)
    Manuscript: Linear algebraic equations, no matrices (292.7 K, pdf, 08 Mar 2009)
    Manuscript: Vectors and Matrices (266.8 K, pdf, 09 Aug 2009)
    Manuscript: Matrix Equations (162.6 K, pdf, 09 Aug 2009)
    Transparencies: Ch3 Page 149+, Exercises 3.1 to 3.6 (869.6 K, pdf, 25 Sep 2003)
    Transparencies: Ch4 Page 237+ slides, Exercises 4.1 to 4.4, some 4.7 (463.2 K, pdf, 25 Sep 2003)
    Slides: Elementary matrix theorems (114.4 K, pdf, 03 Oct 2009)
    Slides: Elementary matrices, vector spaces (35.8 K, pdf, 18 Feb 2007)
    Slides: Linear equations, reduced echelon, three rules (155.6 K, pdf, 06 Aug 2009)
    Slides: Infinitely many solutions case (93.8 K, pdf, 03 Oct 2009)
    Slides: No solution case (58.4 K, pdf, 03 Oct 2009)
    Slides: Unique solution case (86.0 K, pdf, 03 Oct 2009)
    Maple: Lab 5, Linear algebra (94.3 K, pdf, 19 Jul 2009)
    html: Problem notes F2009 (4.0 K, html, 22 Sep 2009)
    Slides: Determinants 2008 (167.7 K, pdf, 03 Oct 2009)
    Manuscript: Determinants, Cramer's rule, Cayley-Hamilton (186.5 K, pdf, 09 Aug 2009)
    Slides: Matrix add, scalar multiply and matrix multiply (122.5 K, pdf, 02 Oct 2009)
    Slides: Inverse matrix, frame sequence method (71.6 K, pdf, 02 Oct 2009)
    Slides: Digital photos, Maxwell's RGB separations, visualization of matrix add (153.7 K, pdf, 16 Oct 2009)
    Slides: More on digital photos, checkerboard analogy (109.5 K, pdf, 02 Oct 2009)
    Slides: Rank, nullity and elimination (111.6 K, pdf, 29 Sep 2009)
    Slides: Base atom, atom, basis for linear DE (85.4 K, pdf, 20 Oct 2009)
    Slides: Orthogonality (87.2 K, pdf, 10 Mar 2008)
    Slides: Partial fraction theory (121.5 K, pdf, 30 Aug 2009)
    Slides: The pivot theorem and applications (131.9 K, pdf, 02 Oct 2009)
    Text: Lawrence Page's pagerank algorithm (0.7 K, txt, 06 Oct 2008)
    Text: History of telecom companies (1.1 K, txt, 05 Oct 2008)