This section (finally) defines the dimension of a subspace. Before the definition, it is proved that any two bases of a subspace W have the same number of vectors. So if one basis for W has two vectors, then every other basis for W also has two vectors. This lets us define the dimension of W to be the number of vectors in any basis for W. So when one basis for W has two vectors, W is a subspace of dimension two.
The big points in this section are (1) the introduction of coordinates by means of a basis, (2) for a matrix A, the equality between the dimension of the range (the column space) and the dimension of the row space (this common dimension is called the rank of A), and (3) for a matrix A, the relationship between the dimension of the null space (the nullity of A) and the rank of A: the rank plus the nullity equals the number of columns of A. Note that the nullity of A is the number of free variables in the solution set of AX = 0.
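The rank-nullity relationship in point (3) is easy to check numerically. Here is a small sketch using NumPy with a hypothetical 3x4 matrix (the matrix is my own example, not one from the text); its third row is the sum of the first two, so the rank drops below 3:

```python
import numpy as np

# Hypothetical 3x4 example matrix: row 3 = row 1 + row 2, so rank < 3.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 1.],
              [1., 3., 1., 2.]])

m, n = A.shape
rank = np.linalg.matrix_rank(A)   # dim of column space = dim of row space
nullity = n - rank                # number of free variables in AX = 0

print(rank, nullity)              # 2 2  -- and rank + nullity = n = 4
```

The count n - rank is exactly the number of free variables you would find by row-reducing A and solving AX = 0 by hand.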
Finally, if you know the dimension of a subspace W is k, then less information is needed to decide whether a set is a basis: if a set of k vectors in W is linearly independent, it is automatically a spanning set and thus a basis. Likewise, if a set of k vectors in W spans W, then it must be linearly independent and thus a basis.
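To illustrate the shortcut, take W to be all of R^3 (a hypothetical choice of mine, so dim W = 3 and k = 3). Then three vectors form a basis as soon as they are independent, which we can test by checking that the rank equals the number of vectors:

```python
import numpy as np

# Three candidate vectors in W = R^3, stacked as columns (my own example).
S = np.column_stack([np.array([1., 0., 1.]),
                     np.array([0., 1., 1.]),
                     np.array([2., 1., 0.])])

# A set of vectors is linearly independent iff the rank of the matrix
# whose columns are those vectors equals the number of vectors.
is_independent = np.linalg.matrix_rank(S) == S.shape[1]
print(is_independent)  # True: since dim W = 3, these 3 vectors are a basis
```

No separate spanning check is needed: independence of k = dim W vectors already guarantees spanning.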
DO odd problems 1-9.
At last some completely new ideas! This is all about right angles and sets of mutually orthogonal (i.e. perpendicular) vectors used as coordinate vectors. The key idea is that two vectors are orthogonal exactly when their dot product equals zero. From there, many of our earlier linear algebra ideas come into play.
Some key facts: (1) An orthogonal set of non-zero vectors is always linearly independent. (2) There is a formula involving dot products that solves the equation Ax = b when the columns of A are orthogonal. (3) There is a method called the Gram-Schmidt process that converts any linearly independent set into an orthogonal set with the same span.
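Facts (2) and (3) can both be sketched in a few lines of NumPy. The vectors below are a hypothetical example of mine, not from the text; the dot-product formula in fact (2) is the standard one: when the columns a_i of A are orthogonal and Ax = b is consistent, each entry is x_i = (a_i . b) / (a_i . a_i).

```python
import numpy as np

def gram_schmidt(vectors):
    """Convert a linearly independent list of vectors into an
    orthogonal set with the same span (classical Gram-Schmidt)."""
    ortho = []
    for v in vectors:
        w = v.astype(float).copy()
        for u in ortho:
            # subtract the projection of v onto each earlier orthogonal vector
            w -= (np.dot(v, u) / np.dot(u, u)) * u
        ortho.append(w)
    return ortho

# Hypothetical independent vectors in R^3.
v1, v2 = np.array([1., 1., 0.]), np.array([1., 0., 1.])
u1, u2 = gram_schmidt([v1, v2])
print(np.dot(u1, u2))   # 0.0: the new set is orthogonal, same span as {v1, v2}

# Fact (2): with orthogonal columns, solve Ax = b entry by entry.
A = np.column_stack([u1, u2])
b = 3*u1 - 2*u2         # a b known to lie in the column space
x = np.array([np.dot(a_i, b) / np.dot(a_i, a_i) for a_i in A.T])
print(x)                # [ 3. -2.]
```

Notice that no row reduction is needed in the last step; orthogonality turns solving Ax = b into a handful of dot products, which is exactly why orthogonal bases are so convenient.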
DO odd problems 1-7.
(Show your work)