Assume \(T\) is self-adjoint. First we show that eigenvectors with distinct eigenvalues are orthogonal. To this end, suppose we have \(T(\boldv)=\lambda\boldv\) and \(T(\boldv')=\lambda'\boldv'\text{,}\) where \(\lambda\ne \lambda'\text{.}\) Using the definition of self-adjoint, we have
\begin{align*}
\angvec{T(\boldv), \boldv'}=\angvec{\boldv, T(\boldv')} \amp\implies \angvec{\lambda\boldv, \boldv'}=\angvec{\boldv, \lambda'\boldv'} \\
\amp\implies \lambda\angvec{\boldv, \boldv'}=\lambda'\angvec{\boldv, \boldv'} \\
\amp\implies (\lambda-\lambda')\angvec{\boldv, \boldv'}=0 \\
\amp \implies \angvec{\boldv, \boldv'}=0 \amp (\lambda\ne \lambda')\text{.}
\end{align*}
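This orthogonality is easy to check numerically. The following Python sketch (an illustration only, not part of the proof) uses a real symmetric matrix as a concrete self-adjoint operator on \(\R^3\) equipped with the dot product:

```python
import numpy as np

# A real symmetric matrix plays the role of a self-adjoint operator on R^3
# with the standard dot product. (Numerical illustration, not part of the proof.)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eigh(A)  # columns are eigenvectors

# Eigenvectors belonging to distinct eigenvalues are orthogonal:
v, w = eigenvectors[:, 0], eigenvectors[:, 2]
assert abs(eigenvalues[0] - eigenvalues[2]) > 1e-9  # distinct eigenvalues
assert abs(np.dot(v, w)) < 1e-9                     # <v, w> = 0
```

Here `numpy.linalg.eigh` is used (rather than `eig`) because it is designed for symmetric matrices.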
We now prove by induction on \(\dim V\) that if \(T\) is self-adjoint, then \(T\) is diagonalizable. The base case \(\dim V=1\) is trivial. Assume the result is true for any \(n\)-dimensional inner product space, and suppose \(\dim V=n+1\text{.}\) By Corollary 5.6.5 there is a nonzero \(\boldv\in V\) with \(T(\boldv)=\lambda\boldv\text{.}\) Let \(W=\Span\{\boldv\}\text{.}\) Since \(\dim W=1\text{,}\) we have \(\dim W^\perp=\dim V-1=n\text{.}\) The following two facts are crucial for the rest of the argument and are left as an exercise (Exercise 5.6.3.10).
\begin{enumerate}
\item For all \(\boldw\in W^\perp\) we have \(T(\boldw)\in W^\perp\text{,}\) and thus by restricting \(T\) to \(W^\perp\) we get a linear transformation \(T\vert_{W^{\perp}}\colon W^\perp\rightarrow W^\perp\text{.}\)
\item The restriction \(T\vert_{W^\perp}\) is self-adjoint, considered as a linear transformation of the inner product space \(W^\perp\text{.}\) Here the inner product on the subspace \(W^\perp\) is inherited from \((V, \angvec{\, , \,})\) by restriction.
\end{enumerate}
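The first of these facts can be sanity-checked numerically. In the sketch below (an illustration under the standard dot product on \(\R^3\), not part of the proof), a symmetric matrix stands in for \(T\text{,}\) and we verify that a vector orthogonal to an eigenvector \(\boldv\) stays orthogonal to \(\boldv\) after the matrix is applied:

```python
import numpy as np

# Numerical sketch (not part of the proof): for symmetric A with eigenvector v,
# A maps the orthogonal complement of span{v} back into itself.
rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
_, vecs = np.linalg.eigh(A)
v = vecs[:, 0]                        # a unit eigenvector of A

w = rng.standard_normal(3)
w_perp = w - np.dot(w, v) * v         # project w onto the complement of span{v}
assert abs(np.dot(w_perp, v)) < 1e-9       # w_perp lies in W^perp
assert abs(np.dot(A @ w_perp, v)) < 1e-9   # and so does A(w_perp)
```

The second assertion works because \(\angvec{T(\boldw), \boldv}=\angvec{\boldw, T(\boldv)}=\lambda\angvec{\boldw, \boldv}=0\text{,}\) which is essentially the content of the exercise.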
Now since \(\dim W^\perp=n\) and \(T\vert_{W^\perp}\) is self-adjoint, we may assume by induction that \(T\vert_{W^\perp}\) has an eigenbasis \(B'=(\boldv_1, \boldv_2,\dots, \boldv_n)\text{.}\) We claim that \(B=(\boldv, \boldv_1, \dots, \boldv_n)\) is an eigenbasis of \(V\text{.}\) Since by definition \(T\vert_{W^{\perp}}(\boldw')=T(\boldw')\) for all \(\boldw'\in W^\perp\text{,}\) we see that the vectors \(\boldv_i\) are also eigenvectors of \(T\text{,}\) and thus that \(B\) consists of eigenvectors. Since \(\dim V=n+1\) and \(B\) contains \(n+1\) vectors, to show \(B\) is a basis it is enough to prove linear independence. Suppose we have
\begin{equation*}
c\boldv+c_1\boldv_1+\cdots +c_n\boldv_n=\boldzero
\end{equation*}
for scalars \(c, c_i\in \R\text{.}\) Taking the inner product of both sides with \(\boldv\text{,}\) and noting that \(\angvec{\boldv, \boldv_i}=0\) since \(\boldv_i\in W^\perp\text{,}\) we have:
\begin{align*}
c\boldv+c_1\boldv_1+\cdots +c_n\boldv_n=\boldzero
\amp\implies \angvec{\boldv, c\boldv+c_1\boldv_1+\cdots +c_n\boldv_n}=\angvec{\boldv, \boldzero} \\
\amp\implies c\angvec{\boldv, \boldv}+\sum_{i=1}^nc_i\angvec{\boldv, \boldv_i}=0 \\
\amp \implies
c\angvec{\boldv, \boldv}=0 \amp (\angvec{\boldv, \boldv_i}=0)\\
\amp \implies c=0 \amp (\angvec{\boldv, \boldv}\ne 0)\text{.}
\end{align*}
It follows that we have
\begin{equation*}
c_1\boldv_1+\cdots +c_n\boldv_n=\boldzero\text{,}
\end{equation*}
and thus \(c_i=0\) for all \(1\leq i\leq n\text{,}\) since \(B'\) is linearly independent. Having proved that \(B\) is an eigenbasis, we conclude that \(T\) is diagonalizable.
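The conclusion of the theorem can also be illustrated numerically: for a real symmetric matrix, `numpy.linalg.eigh` returns an orthonormal eigenbasis, which diagonalizes the matrix. (A numerical sketch of the statement, not a proof.)

```python
import numpy as np

# Sketch of the conclusion: a symmetric matrix A admits an orthonormal
# eigenbasis, hence diagonalizes as A = Q D Q^T with Q orthogonal.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
D_vals, Q = np.linalg.eigh(A)   # columns of Q form an orthonormal eigenbasis
D = np.diag(D_vals)

assert np.allclose(Q.T @ Q, np.eye(3))  # the eigenbasis is orthonormal
assert np.allclose(Q @ D @ Q.T, A)      # A is diagonalized by Q
```

The orthonormality of \(Q\)'s columns reflects the stronger fact (the spectral theorem) that the eigenbasis of a self-adjoint operator can always be chosen orthonormal.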