Mathematics 403 Final Exam Solutions

1. Let A be a real $n \times n$ matrix. Prove that the following are equivalent:

(i)
A is orthogonal (meaning that $A^{t} = A^{-1}$, or equivalently, $A^{t}A = I$).
(ii)
The columns of A are mutually orthogonal unit vectors (with respect to the standard dot product).
(iii)
A preserves the dot product (meaning that $(X \cdot Y) = (AX \cdot AY)$ for all $X,Y \in \mathbf{R}^{n}$).

SOLUTION. I'll start by showing (i) $\Longleftrightarrow$ (ii). Let $v_{1}, v_{2}, \dotsc, v_{n}$ denote the columns of A. The (i,j)-entry of $A^{t}A$ is the dot product of the ith row of $A^{t}$ with the jth column of A. Of course, the ith row of $A^{t}$ is the same as the ith column of A, so the (i,j)-entry of $A^{t}A$ is $(v_{i} \cdot v_{j})$. Thus $A^{t}A = I$ if and only if

\begin{displaymath}(v_{i} \cdot v_{j}) = \begin{cases}
1 & \text{if $i=j$ ,} \\
0 & \text{if $i \neq j$ }.
\end{cases}\end{displaymath}

In other words, $A^{t}A = I$ if and only if the vectors $v_{1}, \dotsc, v_{n}$ are mutually orthogonal unit vectors.

Now I'll show (i) $\Longrightarrow$ (iii).

\begin{displaymath}(AX \cdot AY) = (AX)^{t} AY = X^{t} A^{t} A Y,
\end{displaymath}

so if $A^{t}A = I$, this equals $X^{t}Y = (X \cdot Y)$, as desired.

Finally, I'll show (iii) $\Longrightarrow$ (ii). Given (iii), I know that $(e_{i} \cdot e_{j}) = (Ae_{i} \cdot Ae_{j})$, and $Ae_{i}$ and $Ae_{j}$ are the ith and jth columns of A, respectively. Since the vectors $e_{1}, \dotsc, e_{n}$ are orthonormal, so are the vectors $Ae_{1} = v_{1}, \dotsc, Ae_{n} = v_{n}$.
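(As a quick numerical sanity check, not part of the proof: the following Python sketch verifies (ii) and (iii) for an arbitrarily chosen $2 \times 2$ rotation matrix, which is orthogonal.)

```python
import math

# A 2x2 rotation matrix is orthogonal; the angle is an arbitrary choice.
theta = 0.7
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def dot(X, Y):
    return sum(x * y for x, y in zip(X, Y))

def matvec(A, X):
    return [dot(row, X) for row in A]

# (iii): A preserves the dot product, up to rounding error.
X, Y = [1.0, 2.0], [3.0, -1.0]
assert abs(dot(X, Y) - dot(matvec(A, X), matvec(A, Y))) < 1e-12

# (ii): the columns of A are mutually orthogonal unit vectors.
cols = list(zip(*A))
for i in range(2):
    for j in range(2):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(cols[i], cols[j]) - expected) < 1e-12
```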

2. Draw a wallpaper pattern with a cyclic group for its point group; draw a wallpaper pattern with a dihedral group for its point group.

SOLUTION. See page 173 for lots of examples. The cyclic group pictures don't have any reflections; the dihedral group pictures will have reflections.

3(a). Let G be a group, S a G-set, x an element of S. Recall that the orbit of x, $O_{x}$, is this subset of S:

\begin{displaymath}O_{x} = \{ y \in S \,:\,y = gx \ \text{for some $g \in G$ } \}.
\end{displaymath}

The stabilizer of x, $G_{x}$, is this subgroup of G:

\begin{displaymath}G_{x} = \{ h \in G \,:\,hx = x \}.
\end{displaymath}

Define a map $\phi : G/G_{x} \longrightarrow O_{x}$ by $\phi (aG_{x})
= ax$. Prove that $\phi$ is a well-defined bijection.

SOLUTION. To show that $\phi$ is well-defined, I have to show that if $aG_{x} = bG_{x}$, then $\phi (aG_{x}) = \phi (bG_{x})$; i.e., I have to show that $ax = bx$. Well, $aG_{x} = bG_{x}$ if and only if $b^{-1}a \in G_{x}$, in which case $(b^{-1}a)x = x$. Multiplying both sides by b gives $ax = bx$, as desired.

Running this argument backwards shows that if $\phi (aG_{x}) = \phi (bG_{x})$, then $aG_{x} = bG_{x}$: $\phi$ is one-to-one.

Finally, I have to show that $\phi$ is onto. Given $y \in O_{x}$, we have $y = gx$ for some $g \in G$; thus $y = \phi (gG_{x})$.
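(Since $\phi$ is a bijection, $|G| = |G_{x}| \cdot |O_{x}|$. Here is a small sanity check in Python, using the hypothetical example of $S_{3}$ acting on $\{0, 1, 2\}$.)

```python
from itertools import permutations

# G = S_3 acting on {0, 1, 2}; each g is a tuple with g[i] = g(i).
G = list(permutations(range(3)))
x = 0
orbit = {g[x] for g in G}                    # O_x = {0, 1, 2}
stabilizer = [g for g in G if g[x] == x]     # permutations fixing 0

# The bijection G/G_x -> O_x forces |G| = |G_x| * |O_x|.
assert len(G) == len(stabilizer) * len(orbit)  # 6 == 2 * 3
```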

3(b). Let p be a prime number. Recall that a p-group is a group which has order $p^{n}$ for some n. Prove that the center of a p-group has order larger than 1.

SOLUTION. Let G be a p-group, and consider the class equation for G:

\begin{displaymath}p^{n} = 1 + \text{(other terms)}.
\end{displaymath}

Each of the other terms must divide the order of G, and so must be $p^{i}$ for some $i \leq n$. Now, p divides the left-hand side and p divides each term $p^{i}$ with $i \geq 1$, so p must divide the sum of the 1's; since there is at least one 1 (the class of the identity), there must be more than one 1 on the right side of the equation: the class equation must look like

\begin{displaymath}p^{n} = \underbrace{1 + 1 + \dotsb + 1}_{j} + \text{(other terms)},
\end{displaymath}

where the other terms are of the form $p^{i}$ with $1 \leq i \leq n$. The number j must be larger than 1. (In fact, it must be a multiple of p, but I don't really care about that right now.) Now, remember that the terms in the class equation are the sizes of conjugacy classes. If the conjugacy class of an element x has exactly one element in it, that element must be x (since every element is always conjugate to itself: $x = 1 x 1^{-1}$). Thus $gxg^{-1} = x$ for every $g \in G$; equivalently, $gx = xg$; equivalently, x is in the center of G. Thus the center of G has j elements, where $j \geq 2$.
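(As a concrete sanity check, here is a Python sketch with the dihedral group $D_{4}$ of order $8 = 2^{3}$, realized as permutations of the square's vertices; its center turns out to have exactly 2 elements, the identity and rotation by 180 degrees.)

```python
# D_4 as permutations of the vertices 0, 1, 2, 3 of a square.
r = (1, 2, 3, 0)   # rotation by 90 degrees
s = (0, 3, 2, 1)   # reflection across a diagonal

def compose(g, h):
    # (gh)(i) = g(h(i))
    return tuple(g[h[i]] for i in range(4))

# Generate the group from r and s by repeatedly taking products.
identity = (0, 1, 2, 3)
G = {identity}
frontier = {r, s}
while frontier:
    G |= frontier
    frontier = {compose(g, h) for g in G for h in G} - G
assert len(G) == 8   # |D_4| = 2^3

# The center: elements commuting with everything.
center = [z for z in G if all(compose(z, g) == compose(g, z) for g in G)]
assert len(center) == 2   # larger than 1, as the theorem predicts
```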

4. Let V be a finite-dimensional real vector space and let $\langle \;, \, \rangle$ be a symmetric positive definite bilinear form on V. For any subspace W of V, let $W^{\perp}$ be the orthogonal complement of W:

\begin{displaymath}W^{\perp} = \{ u \,:\,\langle u, w \rangle=0 \ \text{for all} \ w \in W \}.
\end{displaymath}

Show that $V = W \oplus W^{\perp}$; in other words, show:

(a) $W \cap W^{\perp} = 0$.

SOLUTION. Assume that $w \in W \cap W^{\perp}$. Since $w \in W^{\perp}$, w is orthogonal to everything in W; in particular, $\langle w, w \rangle = 0$. Since the form is positive definite, though, $\langle w, w \rangle$ is positive for any nonzero vector w; thus w must be zero. So the only vector in both W and $W^{\perp}$ is the zero vector.

(b) Every vector $v \in V$ can be written in the form $v = w + u$ where $w \in W$ and $u \in W^{\perp}$. [Hint: choose an orthonormal basis for W.]

SOLUTION. Let $w_{1}, \dotsc, w_{r}$ be an orthonormal basis for W. (I mean orthonormal with respect to the form $\langle \;, \, \rangle$. I know that there is an orthonormal basis since the form is positive definite--use the Gram-Schmidt procedure, for instance.) I want to write

\begin{displaymath}v = c_{1} w_{1} + \dotsb + c_{r} w_{r} + u,
\end{displaymath}

where $u \in W^{\perp}$. In other words, I want to choose the scalars $c_{i}$ so that

\begin{displaymath}v - c_{1} w_{1} - \dotsb - c_{r} w_{r} \in W^{\perp}.
\end{displaymath}

If I want to check that a vector is in $W^{\perp}$, it suffices to check that it's orthogonal to each $w_{i}$, so I compute this:

\begin{displaymath}\langle w_{i}, v - c_{1} w_{1} - \dotsb - c_{r} w_{r} \rangle =
\langle w_{i}, v \rangle - c_{i} \langle w_{i}, w_{i} \rangle =
\langle w_{i}, v \rangle - c_{i}.
\end{displaymath}

(I'm using the fact that the $w_{i}$'s are orthonormal.) So if I want this to be zero, I set $c_{i} = \langle w_{i}, v \rangle$.

In other words, for any $v \in V$,

\begin{displaymath}u = v - \sum_{i=1}^{r} \langle w_{i}, v \rangle w_{i} \in W^{\perp},
\end{displaymath}

so v can be written as the sum of something in W (the sum above) with something in $W^{\perp}$ (the vector u).
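(A numerical sketch of this decomposition, with hypothetical data: W spanned by an orthonormal pair in $\mathbf{R}^{3}$ under the ordinary dot product, and an arbitrary v.)

```python
def dot(X, Y):
    return sum(x * y for x, y in zip(X, Y))

w1 = [1.0, 0.0, 0.0]
w2 = [0.0, 1.0, 0.0]          # orthonormal basis for W
v = [3.0, -2.0, 5.0]

# c_i = <w_i, v>, so w = c_1 w_1 + c_2 w_2 and u = v - w.
coeffs = [dot(wi, v) for wi in (w1, w2)]
w = [coeffs[0] * a + coeffs[1] * b for a, b in zip(w1, w2)]
u = [vi - wi for vi, wi in zip(v, w)]

# u is orthogonal to each w_i, hence lies in W-perp; and v = w + u.
assert all(abs(dot(u, wi)) < 1e-12 for wi in (w1, w2))
assert all(abs(a - (b + c)) < 1e-12 for a, b, c in zip(v, w, u))
```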

5. Let A be a real symmetric $n \times n$ matrix. We know that there is an invertible matrix Q so that $QAQ^{t}$ is diagonal, such that each diagonal entry is either 1, -1, or 0. Recall that in this situation, the signature of A is the pair of numbers (p,m), where p is the number of 1's on the diagonal of $QAQ^{t}$, and m is the number of -1's. Show that p is equal to the number of positive eigenvalues of A and m is equal to the number of negative eigenvalues. [Hint: use the spectral theorem.]

SOLUTION. By the spectral theorem, there is an orthogonal matrix P so that $PAP^{t} = PAP^{-1}$ is diagonal, with the eigenvalues as the diagonal entries. Let $\lambda_{1}, \dotsc, \lambda_{n}$ be the eigenvalues. Define a matrix C as follows: C is diagonal, and the (i,i)-entry is

\begin{displaymath}\begin{cases}
1/\sqrt{\vert\lambda_{i}\vert} & \text{if $\lambda_{i} \neq 0$,} \\
1 & \text{if $\lambda_{i} = 0$.}
\end{cases}\end{displaymath}

Then C is invertible and $C^{t} = C$. Now look at $(CP)A(CP)^{t} = C(PAP^{t})C^{t}$: since $PAP^{t}$ is diagonal with ith diagonal entry $\lambda_{i}$, then $C(PAP^{t})C^{t}$ is diagonal with ith diagonal entry

\begin{displaymath}\begin{cases}
1 & \text{if $\lambda_{i} > 0$,} \\
-1 & \text{if $\lambda_{i} < 0$,} \\
0 & \text{if $\lambda_{i} = 0$.}
\end{cases}\end{displaymath}

CP is invertible, so let $Q = CP$: then $QAQ^{t}$ is of the right form, and it has signature (p,m), where p is the number of positive eigenvalues of A and m is the number of negative eigenvalues.
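(A sketch of the scaling step in Python, with a hypothetical list of eigenvalues: the (i,i)-entry of $C(PAP^{t})C^{t}$ is $c_{i} \lambda_{i} c_{i}$, which is the sign of $\lambda_{i}$.)

```python
import math

eigenvalues = [4.0, -9.0, 0.0, 0.25]   # hypothetical eigenvalues of A

def scaled_entry(lam):
    # c_i = 1/sqrt(|lambda_i|) if lambda_i != 0, else 1.
    c = 1.0 if lam == 0 else 1.0 / math.sqrt(abs(lam))
    return c * lam * c    # the (i,i)-entry of C (PAP^t) C^t

signs = [scaled_entry(lam) for lam in eigenvalues]
# Each entry became 1, -1, or 0, up to rounding error.
assert all(abs(s - t) < 1e-12 for s, t in zip(signs, [1.0, -1.0, 0.0, 1.0]))

p = sum(1 for s in signs if s > 0.5)    # number of 1's
m = sum(1 for s in signs if s < -0.5)   # number of -1's
assert (p, m) == (2, 1)   # matches the counts of positive/negative eigenvalues
```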

6. Let U(n) be the set of complex $n \times n$ unitary matrices. Show that the product of two unitary matrices is unitary, and the inverse of a unitary matrix is unitary; in other words, show that U(n) is a subgroup of $GL_{n}(\mathbf{C})$.

SOLUTION. Recall that a matrix A is unitary if and only if $AA^{*} = I$. If A and B are unitary, then $(AB)(AB)^{*} = ABB^{*}A^{*} = A(BB^{*})A^{*} = A(I)A^{*} = I$, so AB is unitary. If A is unitary, then A is invertible with $A^{-1} = A^{*}$; I need to check that $A^{*}$ is unitary: $A^{*}(A^{*})^{*} = A^{*}A$. We know from last quarter that if C and D are square matrices with $CD = I$, then $DC = I$; thus $A^{*}A = I$, as desired.
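(A quick check with two arbitrarily chosen $2 \times 2$ unitary matrices, using Python's built-in complex numbers.)

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def conj_transpose(A):
    # A* : transpose, then complex-conjugate each entry.
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def is_identity(M, tol=1e-12):
    return all(abs(M[i][j] - (1 if i == j else 0)) < tol
               for i in range(2) for j in range(2))

s = 2 ** -0.5
A = [[s, s], [s, -s]]        # real orthogonal, hence unitary
B = [[0, 1j], [1j, 0]]       # unitary with complex entries

assert is_identity(matmul(A, conj_transpose(A)))   # A A* = I
assert is_identity(matmul(B, conj_transpose(B)))   # B B* = I
AB = matmul(A, B)
assert is_identity(matmul(AB, conj_transpose(AB))) # (AB)(AB)* = I
```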

7. Let A be a real symmetric matrix, and define a bilinear form $\langle \;, \, \rangle$ on $\mathbf{R}^{n}$ by $\langle X, Y \rangle = X^{t}AY$. Of course, there is also the ordinary dot product $(X \cdot Y) = X^{t} Y$.

True or false: If A is a real symmetric matrix, then the eigenvectors for A are orthogonal with respect to both the ordinary dot product $(\, \cdot \,)$ and the bilinear form $\langle \;, \, \rangle$. Give a proof or a counterexample.

SOLUTION. This is true. The spectral theorem says that the eigenvectors are orthogonal with respect to the ordinary dot product. Now assume that X and Y are eigenvectors, with $AY = \lambda Y$. Then

\begin{displaymath}\langle X, Y \rangle = X^{t} A Y = X^{t} (\lambda Y) = \lambda (X^{t}Y) =
\lambda (X \cdot Y).
\end{displaymath}

Since X and Y are orthogonal with respect to the ordinary dot product, this is zero; hence they're orthogonal with respect to the form defined by A, too.
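(A numerical illustration with the hypothetical symmetric matrix $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, whose eigenvectors are $(1,1)$ and $(1,-1)$ with eigenvalues 3 and 1.)

```python
A = [[2, 1], [1, 2]]
X, Y = [1, 1], [1, -1]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def form(u, v):
    # <u, v> = u^t A v
    Av = [dot(row, v) for row in A]
    return dot(u, Av)

# Check that X and Y really are eigenvectors: AX = 3X, AY = Y.
assert [dot(row, X) for row in A] == [3, 3]
assert [dot(row, Y) for row in A] == [1, -1]

assert dot(X, Y) == 0    # orthogonal for the ordinary dot product
assert form(X, Y) == 0   # orthogonal for the form defined by A
```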

8. Describe the Gram-Schmidt procedure.

SOLUTION. This is a procedure for constructing orthonormal bases, given a symmetric positive definite bilinear form on a finite-dimensional real vector space V. More precisely, you start with any basis for V, and the Gram-Schmidt procedure tells you how to alter it, inductively, to get an orthonormal basis.

Even more precisely, suppose V is a vector space with bilinear form $\langle \;, \, \rangle$, satisfying the conditions above. Let $(v_{1}, \dotsc, v_{n})$ be a basis for V. To construct an orthonormal basis $(w_{1}, \dotsc, w_{n})$, first normalize $v_{1}$: let

\begin{displaymath}w_{1} = \frac{1}{\sqrt{\langle v_{1}, v_{1} \rangle}} v_{1}.
\end{displaymath}

Then $w_{1}$ is a unit vector.

Suppose now that we've constructed mutually orthogonal unit vectors $w_{1}, \dotsc, w_{k-1}$ out of the $v_{i}$'s. Define a vector w as follows:

\begin{displaymath}w = v_{k} - \sum_{i=1}^{k-1} \langle v_{k}, w_{i} \rangle w_{i}.
\end{displaymath}

Then w is orthogonal to each $w_{i}$, so normalize it to get $w_{k}$:

\begin{displaymath}w_{k} = \frac{1}{\sqrt{\langle w, w \rangle}} w.
\end{displaymath}
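The procedure above can be sketched in Python for the ordinary dot product on $\mathbf{R}^{3}$ (the starting basis is an arbitrary choice):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    ws = []
    for v in basis:
        # Subtract off the components along the w_i already built...
        w = list(v)
        for wi in ws:
            c = dot(v, wi)
            w = [a - c * b for a, b in zip(w, wi)]
        # ...then normalize w to get the next unit vector.
        norm = math.sqrt(dot(w, w))
        ws.append([a / norm for a in w])
    return ws

ws = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
# The result is orthonormal: <w_i, w_j> = 1 if i == j, else 0.
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(ws[i], ws[j]) - expected) < 1e-12
```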


