# Math 308: Matrix Algebra with Applications

Autumn 2017

#### Time and Place

The course meets on Mondays, Wednesdays, and Fridays.

- Section E: 10:30-11:20am at DEM 004
- Section F: 11:30am-12:20pm at THO 119

#### Instructor

- Name: Lucas Braune
- Website: www.math.uw.edu/~lvhb
- E-mail: lvhb 'at' uw.edu
- Office hours: T-Th, 09:30-11:00am

#### Calendar

| Date | Topic |
| --- | --- |
| Sep 27 | Linear systems (row and column pictures). Elimination. |
| Sep 29 | Matrix-vector multiplication. Singular systems. Echelon form. |
| Oct 2 | Nonsingular matrices. Leading and free variables. Chemical reactions. |
| Oct 4 | Computational cost. Poisson's equation. Sparse systems. |
| Oct 6 | Abstract vector spaces. Subspaces. |
| Oct 9 | Column space. Span. Nullspace. |
| Oct 11 | Review: computation of N(A) and of particular solutions to Ax=b. |
| Oct 13 | Linear independence. Review of span and nonsingular matrices. |
| Oct 16 | Problems from an old Midterm exam. |
| Oct 18 | Midterm 1 |
| Oct 20 | Linear transformations. |
| Oct 23 | The matrix of a linear transformation. One-to-one and onto. |
| Oct 25 | Composition and matrix multiplication. |
| Oct 27 | The matrix inverse. |
| Oct 30 | Inverse via Gauss-Jordan. LU decomposition. |
| Nov 1 | Inverse of a linear transformation. Basis and dimension. |
| Nov 3 | Bases for row(A), col(A) and null(A). |
| Nov 6 | Determinants: properties. |
| Nov 8 | Determinants: the big formula, cofactor expansions. |
| Nov 10 | No class: Veterans Day (observed) |
| Nov 13 | Review |
| Nov 15 | Midterm 2 |
| Nov 17 | Eigenvalues, eigenvectors. Characteristic polynomial. Trace. |
| Nov 20 | Diagonalization. Powers of a matrix. Fibonacci sequence. |
| Nov 22 | Systems of ODEs. Markov matrices. Steady states. |
| Nov 24 | No class: Thanksgiving Friday |
| Nov 27 | Transposes. Orthogonal vectors and subspaces. |
| Nov 29 | Projections. Least squares. |
| Dec 1 | Least squares. Fitting a line to data points. |
| Dec 4 | Orthonormal bases. Orthogonal matrices. Gram-Schmidt. |
| Dec 6 | QR decomposition. Eigenvalue computations. Fourier series. |
| Dec 8 | Course review. |
| Dec 11 | Final exam (section E) |
| Dec 13 | Final exam (section F) |

#### Textbook

The textbook for Math 308 is Linear Algebra with Applications (second edition, with Webassign) by Jeffrey Holt. A loose-leaf version of the book is available at the University Bookstore. Alternatively, it is possible to buy digital access to the book through Webassign.

#### Homework

Homework for this class will be submitted through Webassign. See the information on Webassign put together for students by Jennifer Taggart.

Students who wish to practice for the exams may want to consult the Math 308 Exam Archive compiled by Kristin DeVleming. Bear in mind that exams from previous incarnations of Math 308 may cover material different from what we discuss in the present course.

| Component | Weight |
| --- | --- |
| Homework | 20% |
| Midterm 1 | 20% |
| Midterm 2 | 20% |
| Final Exam | 40% |

The dates of the final exams are set by the university and cannot be changed. The final exam for section E will take place from 8:30 to 10:30am on Monday, December 11, at DEM 004. The final exam for section F will take place from 2:30 to 4:30pm on Wednesday, December 13, at THO 119.

#### Course description

Math 308 is a course on Linear Algebra. At its core, Linear Algebra is about solving systems of linear equations. We will begin by discussing Gaussian elimination, a method that can be used to solve any such system, and also one of the most important algorithms of pure and applied mathematics.
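For the curious, here is a minimal sketch of Gaussian elimination in Python. It is not part of the course materials; the function name is illustrative, and the partial-pivoting step is one standard way to keep the arithmetic stable.

```python
# A minimal sketch of Gaussian elimination with back substitution.
# Solves Ax = b for a square, nonsingular A. All names illustrative.

def gauss_solve(A, b):
    """Solve Ax = b by forward elimination and back substitution."""
    n = len(A)
    # Work on an augmented copy [A | b] so the inputs are not modified.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        # Partial pivoting: swap in the row with the largest pivot.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in reversed(range(n)):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Example: 2x + y = 5 and x + 3y = 10 have solution x = 1, y = 3.
print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # [1.0, 3.0]
```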

We will interpret a linear system as a single equation $Ax = b$, where $x$ is an unknown vector which, multiplied by a matrix $A$, yields a given vector $b$. This will lead us to the rules of multiplying matrices, and to the notion of a matrix inverse. When the inverse $A^{-1}$ of a matrix $A$ exists, the system $Ax=b$ is easy to solve: $x=A^{-1}b$!
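In the 2x2 case the inverse has a simple closed form, so the recipe $x = A^{-1}b$ can be sketched directly (helper names are illustrative; in practice one solves $Ax=b$ by elimination, which is faster and more accurate than forming the inverse):

```python
# Solving Ax = b via the inverse, for a 2x2 matrix A, where
# A^{-1} has the closed form (1/det) * [[d, -b], [-c, a]].

def inverse_2x2(A):
    """Return the inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(A, x):
    """Matrix-vector product Ax."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[1.0, 2.0], [3.0, 4.0]]
b = [5.0, 11.0]
x = matvec(inverse_2x2(A), b)  # x = A^{-1} b
print(x)  # [1.0, 2.0]
```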

In tandem with the algebra of matrices, we will study their geometry: how do vectors (or lines, or planes) move when you multiply them by a matrix? This point of view will lead us to the notion of a linear transformation, which greatly illuminates matrix multiplication. Along the way, we will meet the geometric notions of linear independence, span and dimension of a linear subspace of $\mathbb R^n$.
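A small taste of this geometric viewpoint: the standard rotation matrix below turns the plane by 90 degrees counterclockwise, so multiplying a vector by it rotates the vector. (A purely illustrative snippet, not course code.)

```python
# Matrix multiplication as geometry: a 2x2 rotation matrix.
import math

def matvec(A, x):
    """Matrix-vector product Ax."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

theta = math.pi / 2  # 90 degrees
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Rotating the x-axis unit vector lands (up to rounding) on the y-axis.
print(matvec(R, [1.0, 0.0]))  # approximately [0.0, 1.0]
```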

Next, we will dispel the mystery of determinants. Geometrically, determinants measure volumes. Alternatively, the determinant can be seen as a gadget that takes $n$ size-$n$ vectors and outputs a number. This gadget can be completely understood in terms of three easy-to-remember rules of algebra.
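Those rules also give an efficient way to compute determinants: eliminate to triangular form while tracking row swaps, then multiply the diagonal. A sketch under those assumptions (illustrative names, not course code):

```python
# Determinant via elimination: swapping rows flips the sign, adding
# a multiple of one row to another leaves the determinant unchanged,
# and the determinant of a triangular matrix is the product of its
# diagonal entries.

def det(A):
    """Determinant of a square matrix, computed by row reduction."""
    M = [row[:] for row in A]  # work on a copy
    n = len(M)
    sign = 1.0
    for k in range(n):
        # Partial pivoting, tracking the sign change of each swap.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        if M[p][k] == 0:
            return 0.0  # singular matrix
        if p != k:
            M[k], M[p] = M[p], M[k]
            sign = -sign
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n):
                M[i][j] -= f * M[k][j]
    prod = sign
    for k in range(n):
        prod *= M[k][k]
    return prod

print(det([[2.0, 1.0], [1.0, 3.0]]))  # 2*3 - 1*1 = 5.0
```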

Using determinants, we will introduce eigenvalues and eigenvectors. In a certain sense, these are the numbers and vectors that allow us to completely understand any matrix. This point will be illustrated by applications to recursion problems and systems of ODEs.
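As a preview of the Fibonacci application from the calendar: the recursion can be packaged as repeated multiplication by a 2x2 matrix, whose eigenvalues lead to a closed-form formula. A minimal sketch with illustrative names:

```python
# The Fibonacci recursion F(n+1) = F(n) + F(n-1) can be written as
# [F(n+1), F(n)] = [[1, 1], [1, 0]] * [F(n), F(n-1)], so the nth
# Fibonacci number can be read off a power of the Fibonacci matrix.

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def fib(n):
    """nth Fibonacci number via powers of the Fibonacci matrix."""
    F = [[1, 1], [1, 0]]
    P = [[1, 0], [0, 1]]  # identity matrix
    for _ in range(n):
        P = matmul(P, F)   # P becomes F^n
    return P[0][1]         # the (0, 1) entry of F^n is F(n)

print([fib(n) for n in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```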

To conclude the course, we will discuss lengths and angles. We will see that, without much creativity, one can make sense of these notions in dimensions 4 and higher. This will open the way to interesting applications. Time permitting, we will discuss least squares regression, a technique that allows us to answer the question: "What is the line that best approximates 30 given data points?"
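As a preview, fitting a line $y = mx + c$ by least squares reduces to a pair of linear equations (the normal equations) whose solution has a closed form. A sketch with illustrative names:

```python
# Least squares line fit: choose m and c to minimize the sum of
# squared vertical errors sum((m*x_i + c - y_i)^2). The normal
# equations give the closed-form solution below.

def fit_line(xs, ys):
    """Return slope m and intercept c of the least squares line."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - m * sx) / n
    return m, c

# Points lying exactly on y = 2x + 1 are recovered exactly.
print(fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]))  # (2.0, 1.0)
```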

(Another landmark application of the notion of orthogonality is Fourier analysis. This is used for example in MP3 compression to discard from an audio signal its components in frequencies that cannot be heard by the human ear. Interested students can talk to me if they want to learn more about this or other applications of linear algebra that we are [unfortunately] unable to cover during lectures. A summary of what the student finds can be worth extra points for them!)

Interspersed with the development of the theory sketched above, lectures will include as many of the following applications as time permits: numerical solution of the Laplace equation (heat distribution), Markov chains (population dynamics and stock markets), and graphs and networks (circuits, supply chains).
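A taste of the Markov chain application: repeatedly multiplying by a transition matrix drives any starting distribution toward the steady state, an eigenvector with eigenvalue 1. The two-state "weather" chain below is entirely made up for illustration.

```python
# A hypothetical two-state Markov chain (state 0 = sunny, 1 = rainy).
# Each column of P sums to 1 and gives the next-day probabilities.

def matvec(P, v):
    """Matrix-vector product Pv."""
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

P = [[0.9, 0.5],
     [0.1, 0.5]]

v = [1.0, 0.0]  # start in state 0 with certainty
for _ in range(100):
    v = matvec(P, v)  # advance the distribution one day

# The iterates approach the steady state [5/6, 1/6], which solves Pv = v.
print(v)
```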