1. Let $A$ be a real $n \times n$ matrix. Prove that the following are equivalent: (i) $A^tA = I$; (ii) the columns of $A$ form an orthonormal set of vectors; (iii) $A$ preserves dot products: $(Av) \cdot (Aw) = v \cdot w$ for all vectors $v$ and $w$.
SOLUTION. I'll start by showing (i) $\Leftrightarrow$ (ii). Let $v_1, v_2, \ldots, v_n$ denote the columns of $A$. The $(i,j)$-entry of $A^tA$ is the dot product of the $i$th row of $A^t$ with the $j$th column of $A$. Of course, the $i$th row of $A^t$ is the same as the $i$th column of $A$, so the $(i,j)$-entry of $A^tA$ is $v_i \cdot v_j$. Thus $A^tA = I$ if and only if
$$v_i \cdot v_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j, \end{cases}$$
which says exactly that the columns $v_1, \ldots, v_n$ are orthonormal.
Now I'll show (i) $\Rightarrow$ (iii). Suppose $A^tA = I$. Recall that the dot product can be written as $v \cdot w = v^t w$; then for any vectors $v$ and $w$,
$$(Av) \cdot (Aw) = (Av)^t (Aw) = v^t A^t A w = v^t w = v \cdot w,$$
so $A$ preserves dot products.
Finally, I'll show (iii) $\Rightarrow$ (ii). Given (iii), I know that $Ae_i \cdot Ae_j = e_i \cdot e_j$; but $Ae_i$ and $Ae_j$ are the $i$th and $j$th columns of $A$, respectively. Since the vectors $e_1, \ldots, e_n$ are orthonormal, so are the vectors $Ae_1 = v_1, \ldots, Ae_n = v_n$.
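As a quick sanity check (not part of the proof), the equivalence can be tested numerically on a sample matrix; here a $2 \times 2$ rotation matrix, whose columns are orthonormal:

```python
import math

# A 2x2 rotation matrix: its columns are orthonormal, so A^t A should be I.
theta = 0.7
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# (i) A^t A = I
AtA = matmul(transpose(A), A)
print(AtA)  # approximately the identity matrix

# (iii) A preserves dot products
v, w = [1.0, 2.0], [3.0, -1.0]
print(dot(matvec(A, v), matvec(A, w)), dot(v, w))  # equal up to rounding
```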
2. Draw a wallpaper pattern with a cyclic group for its point group; draw a wallpaper pattern with a dihedral group for its point group.
SOLUTION. See page 173 for lots of examples. The cyclic group pictures don't have any reflections; the dihedral group pictures do.
3(a). Let $G$ be a group, $S$ a $G$-set, $x$ an element of $S$. Recall that the orbit of $x$, $O_x$, is this subset of $S$: $O_x = \{ gx : g \in G \}$. Let $G_x$ denote the stabilizer of $x$. Prove that the map $\varphi \colon G/G_x \to O_x$ defined by $\varphi(aG_x) = ax$ is a well-defined bijection.
SOLUTION. Define $\varphi \colon G/G_x \to O_x$ by $\varphi(aG_x) = ax$. To show that $\varphi$ is well-defined, I have to show that if $aG_x = bG_x$, then $\varphi(aG_x) = \varphi(bG_x)$; i.e., I have to show that $ax = bx$. Well, $aG_x = bG_x$ if and only if $b^{-1}a \in G_x$, in which case $(b^{-1}a)x = x$. "Multiply" both sides by $b$: $ax = bx$, as desired.
Running this argument backwards shows that if $ax = bx$, then $aG_x = bG_x$: $\varphi$ is one-to-one.
Finally, I have to show that $\varphi$ is onto. Given $y \in O_x$, then $y = gx$ for some $g \in G$; thus $\varphi(gG_x) = gx = y$.
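A consequence of this bijection is the counting formula $|G| = |O_x| \cdot |G_x|$, which can be checked on a small example; here is a sketch with $S_3$ acting on $\{0,1,2\}$ by permutation:

```python
from itertools import permutations

# S_3 acting on {0, 1, 2}: each group element is a tuple g with g[i] = image of i.
G = list(permutations(range(3)))
x = 0

orbit = {g[x] for g in G}                 # O_x = { g.x : g in G }
stabilizer = [g for g in G if g[x] == x]  # G_x = { g in G : g.x = x }

# The bijection G/G_x -> O_x means the number of cosets equals |O_x|,
# so |G| = |O_x| * |G_x|.
print(len(G), len(orbit) * len(stabilizer))  # 6 6
```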
3(b). Let $p$ be a prime number. Recall that a $p$-group is a group which has order $p^n$ for some $n \geq 1$. Prove that the center of a $p$-group has order larger than 1.
SOLUTION. Let $G$ be a $p$-group, say $|G| = p^n$, and consider the class equation for $G$:
$$|G| = |Z(G)| + \sum [G : C(x)],$$
where $Z(G)$ is the center of $G$, the sum runs over representatives $x$ of the conjugacy classes with more than one element, and $C(x)$ is the centralizer of $x$. Each index $[G : C(x)]$ appearing in the sum divides $|G| = p^n$ and is bigger than 1, so each is divisible by $p$. Since $|G|$ is also divisible by $p$, so is $|Z(G)| = |G| - \sum [G : C(x)]$. In particular $|Z(G)| \neq 1$: the center has order larger than 1.
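To illustrate (not prove) the statement, the center of a small 2-group can be computed directly; here the dihedral group of order 8, realized as permutations of the square's vertices:

```python
# Dihedral group of order 8 (a 2-group), as permutations of the square's
# vertices 0, 1, 2, 3: generated by the rotation r and a reflection s.
r = (1, 2, 3, 0)   # vertex i -> i + 1 mod 4
s = (0, 3, 2, 1)   # reflection across the diagonal through vertices 0 and 2

def compose(g, h):
    # (g h)(i) = g(h(i))
    return tuple(g[h[i]] for i in range(4))

# Generate the subgroup of S_4 generated by r and s by repeated closure.
G = {r, s}
while True:
    new = {compose(g, h) for g in G for h in G} - G
    if not new:
        break
    G |= new

# The center: elements commuting with everything.
center = [z for z in G if all(compose(z, g) == compose(g, z) for g in G)]
print(len(G), len(center))  # 8 2 -- the center is {identity, rotation by 180}
```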
4. Let $V$ be a finite-dimensional real vector space and let $\langle\,,\,\rangle$ be a symmetric positive definite bilinear form on $V$. For any subspace $W$ of $V$, let $W^\perp$ be the orthogonal complement of $W$:
$$W^\perp = \{ v \in V : \langle v, w \rangle = 0 \text{ for all } w \in W \}.$$
(a) Show that $W \cap W^\perp = \{0\}$.
SOLUTION. Assume that $w \in W \cap W^\perp$. Since $w \in W^\perp$, then $w$ is orthogonal to everything in $W$; in particular, $\langle w, w \rangle = 0$. Since the form is positive definite, though, $\langle w, w \rangle$ is positive for any nonzero vector $w$; thus $w$ must be zero. So the only vector in both $W$ and $W^\perp$ is the zero vector.
(b) Every vector $v \in V$ can be written in the form $v = w + u$ where $w \in W$ and $u \in W^\perp$. [Hint: choose an orthonormal basis for $W$.]
SOLUTION. Let $w_1, \ldots, w_r$ be an orthonormal basis for $W$. (I mean orthonormal with respect to the form $\langle\,,\,\rangle$. I know that there is an orthonormal basis since the form is positive definite--use the Gram-Schmidt procedure, for instance.) I want to write $v = w + u$, so let
$$w = \sum_{i=1}^r \langle v, w_i \rangle w_i \quad \text{and} \quad u = v - w.$$
Certainly $w \in W$ and $v = w + u$; it remains to check that $u \in W^\perp$. In other words, for any $j$ with $1 \leq j \leq r$, I need $\langle u, w_j \rangle = 0$:
$$\langle u, w_j \rangle = \langle v, w_j \rangle - \sum_{i=1}^r \langle v, w_i \rangle \langle w_i, w_j \rangle = \langle v, w_j \rangle - \langle v, w_j \rangle = 0,$$
since $\langle w_i, w_j \rangle$ is 1 when $i = j$ and 0 otherwise. Since $u$ is orthogonal to each basis vector of $W$, it is orthogonal (by linearity) to everything in $W$, so $u \in W^\perp$.
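With the ordinary dot product playing the role of the form, the decomposition $v = w + u$ in part (b) can be sketched numerically:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# An orthonormal basis for a plane W inside R^3 (here, the xy-plane).
w1 = [1.0, 0.0, 0.0]
w2 = [0.0, 1.0, 0.0]

v = [3.0, -2.0, 5.0]

# w = <v, w1> w1 + <v, w2> w2 is the piece of v lying in W; u = v - w.
w = [dot(v, w1) * a + dot(v, w2) * b for a, b in zip(w1, w2)]
u = [vi - wi for vi, wi in zip(v, w)]

print(w, u)                    # [3.0, -2.0, 0.0] [0.0, 0.0, 5.0]
print(dot(u, w1), dot(u, w2))  # 0.0 0.0 -- u lies in the orthogonal complement
```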
5. Let $A$ be a real symmetric matrix. We know that there is an invertible matrix $Q$ so that $QAQ^t$ is diagonal, such that each diagonal entry is either 1, $-1$, or 0. Recall that in this situation, the signature of $A$ is the pair of numbers $(p, m)$, where $p$ is the number of 1's on the diagonal of $QAQ^t$, and $m$ is the number of $-1$'s. Show that $p$ is equal to the number of positive eigenvalues of $A$ and $m$ is equal to the number of negative eigenvalues. [Hint: use the spectral theorem.]
SOLUTION. By the spectral theorem, there is an orthogonal matrix $P$ so that $PAP^t = PAP^{-1}$ is diagonal, with the eigenvalues as the diagonal entries. Let $\lambda_1, \ldots, \lambda_n$ be the eigenvalues. Define a matrix $C$ as follows: $C$ is diagonal, and the $(i,i)$-entry is $1/\sqrt{|\lambda_i|}$ if $\lambda_i \neq 0$, and 1 if $\lambda_i = 0$. Now let $Q = CP$; then
$$QAQ^t = C(PAP^t)C^t$$
is diagonal, and its $(i,i)$-entry is $\lambda_i / |\lambda_i|$ if $\lambda_i \neq 0$, and 0 if $\lambda_i = 0$. That is, the diagonal entries of $QAQ^t$ are 1 at the positive eigenvalues, $-1$ at the negative eigenvalues, and 0 at the zero eigenvalues. Since the signature does not depend on the choice of $Q$, this shows that $p$ is the number of positive eigenvalues of $A$ and $m$ is the number of negative eigenvalues.
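The construction can be traced through a concrete $2 \times 2$ example, where the eigenvalues and eigenvectors are easy to write down by hand:

```python
import math

# A = [[0, 1], [1, 0]] has eigenvalues 1 and -1, so its signature is (1, 1).
A = [[0.0, 1.0], [1.0, 0.0]]

# Orthogonal P whose rows are unit eigenvectors (spectral theorem):
# (1,1)/sqrt(2) for eigenvalue 1, (1,-1)/sqrt(2) for eigenvalue -1.
s = 1 / math.sqrt(2)
P = [[s, s], [s, -s]]

# Both eigenvalues have absolute value 1, so the scaling matrix C is the
# identity and Q = CP = P.
Q = P

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def transpose(M):
    return [list(row) for row in zip(*M)]

D = matmul(matmul(Q, A), transpose(Q))
print(D)  # approximately [[1, 0], [0, -1]]: p = 1 positive, m = 1 negative
```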
6. Let $U(n)$ be the set of $n \times n$ complex unitary matrices. Show that the product of two unitary matrices is unitary, and the inverse of a unitary matrix is unitary; in other words, show that $U(n)$ is a subgroup of $GL_n(\mathbf{C})$.
SOLUTION. Recall that a matrix $A$ is unitary if and only if $AA^* = I$. If $A$ and $B$ are unitary, then $(AB)(AB)^* = ABB^*A^* = A(BB^*)A^* = A(I)A^* = I$, so $AB$ is unitary. If $A$ is unitary, then $A$ is invertible with $A^{-1} = A^*$; I need to check that $A^*$ is unitary: $A^*(A^*)^* = A^*A$. We know from last quarter that if $C$ and $D$ are square matrices with $CD = I$, then $DC = I$; thus $A^*A = I$, as desired. (The identity matrix is certainly unitary, so $U(n)$ is nonempty.)
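These closure properties can be spot-checked numerically; here is a sketch with two sample $2 \times 2$ unitary matrices, a real rotation and a diagonal phase matrix:

```python
import cmath
import math

# Two 2x2 unitary matrices: a rotation and a diagonal matrix of phases.
s = 1 / math.sqrt(2)
A = [[s, -s], [s, s]]
B = [[cmath.exp(1j * 0.3), 0], [0, cmath.exp(-1j * 0.3)]]

def conj_transpose(M):
    # M* : transpose and conjugate each entry.
    return [[complex(M[j][i]).conjugate() for j in range(len(M))]
            for i in range(len(M[0]))]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def is_unitary(M, tol=1e-9):
    # Check M M* = I, up to rounding.
    P = matmul(M, conj_transpose(M))
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(len(P)) for j in range(len(P)))

AB = matmul(A, B)
print(is_unitary(A), is_unitary(B), is_unitary(AB))  # True True True
print(is_unitary(conj_transpose(A)))  # True: the inverse A* is also unitary
```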
7. Let $A$ be a real symmetric matrix, and define a bilinear form on $\mathbf{R}^n$ by $\langle X, Y \rangle = X^t A Y$. Of course, there is also the ordinary dot product $X \cdot Y = X^t Y$.
True or false: If $A$ is a real symmetric matrix, then eigenvectors for $A$ belonging to distinct eigenvalues are orthogonal with respect to both the ordinary dot product and the bilinear form $\langle\,,\,\rangle$. Give a proof or a counterexample.
SOLUTION. This is true. The spectral theorem says that eigenvectors belonging to distinct eigenvalues are orthogonal with respect to the ordinary dot product. Now assume that $X$ and $Y$ are such eigenvectors, with $AY = \mu Y$ and $X \cdot Y = 0$. Then
$$\langle X, Y \rangle = X^t A Y = X^t (\mu Y) = \mu (X^t Y) = \mu (X \cdot Y) = 0,$$
so $X$ and $Y$ are also orthogonal with respect to the form $\langle\,,\,\rangle$.
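A concrete example makes the computation visible; $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has easily checked eigenvectors:

```python
# A = [[2, 1], [1, 2]] has eigenvectors X = (1, 1) and Y = (1, -1),
# with eigenvalues 3 and 1 respectively.
A = [[2.0, 1.0], [1.0, 2.0]]
X = [1.0, 1.0]
Y = [1.0, -1.0]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

# Sanity check that X and Y really are eigenvectors.
print(matvec(A, X), matvec(A, Y))  # [3.0, 3.0] and [1.0, -1.0]

# Orthogonal for the dot product AND for the form <X, Y> = X^t A Y.
print(dot(X, Y))             # 0.0
print(dot(X, matvec(A, Y)))  # 0.0
```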
8. Describe the Gram-Schmidt procedure.
SOLUTION. This is a procedure for constructing orthonormal bases, given a symmetric positive definite bilinear form on a finite-dimensional real vector space V. More precisely, you start with any basis for V, and the Gram-Schmidt procedure tells you how to alter it, inductively, to get an orthonormal basis.
Even more precisely, suppose $V$ is a vector space with bilinear form $\langle\,,\,\rangle$ satisfying the conditions above. Let $v_1, \ldots, v_n$ be a basis for $V$. To construct an orthonormal basis $w_1, \ldots, w_n$, first normalize $v_1$: let
$$w_1 = \frac{v_1}{\sqrt{\langle v_1, v_1 \rangle}}.$$
Suppose now that we've constructed mutually orthogonal unit vectors $w_1, \ldots, w_{k-1}$ out of the $v_i$'s. Define a vector $w$ as follows:
$$w = v_k - \sum_{i=1}^{k-1} \langle v_k, w_i \rangle w_i.$$
Then $\langle w, w_j \rangle = \langle v_k, w_j \rangle - \langle v_k, w_j \rangle = 0$ for each $j < k$, so $w$ is orthogonal to $w_1, \ldots, w_{k-1}$; also $w \neq 0$, since otherwise $v_k$ would be a linear combination of $w_1, \ldots, w_{k-1}$, hence of $v_1, \ldots, v_{k-1}$. So let $w_k = w / \sqrt{\langle w, w \rangle}$. Continuing in this way produces an orthonormal basis $w_1, \ldots, w_n$.
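With the ordinary dot product as the form, the procedure above can be sketched in a few lines of code (a numerical sketch, not the course's official formulation):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Turn a basis of R^n into an orthonormal basis, using the dot product."""
    ortho = []
    for v in basis:
        # Subtract off the components of v along the unit vectors built so far.
        w = list(v)
        for u in ortho:
            c = dot(v, u)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        # Normalize (w is nonzero because the input vectors are independent).
        norm = math.sqrt(dot(w, w))
        ortho.append([wi / norm for wi in w])
    return ortho

basis = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
onb = gram_schmidt(basis)
for u in onb:
    print([round(x, 3) for x in u])
# Each output vector has length 1, and distinct vectors have dot product 0.
```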