
Section 4.2 Orthogonal bases

Subsection 4.2.1 Orthogonal vectors and sets

Definition 4.2.1. Orthogonality.

Let \((V,\langle \ , \rangle)\) be an inner product space. Vectors \(\boldv, \boldw\in V\) are orthogonal if \(\langle \boldv, \boldw\rangle =0\text{.}\)
Let \(S\subseteq V\) be a set of nonzero vectors.
  • The set \(S\) is orthogonal if \(\langle\boldv,\boldw \rangle=0\) for all \(\boldv\ne\boldw\in S\text{.}\) We say that the elements of \(S\) are pairwise orthogonal in this case.
  • The set \(S\) is orthonormal if it is both orthogonal and satisfies \(\norm{\boldv}=1\) for all \(\boldv\in S\text{:}\) i.e., \(S\) consists of pairwise orthogonal unit vectors.
Theorem 4.2.2. Orthogonal implies linearly independent.

Let \((V,\langle \ , \rangle)\) be an inner product space, and let \(S\subseteq V\) be an orthogonal set of nonzero vectors. Then \(S\) is linearly independent.

Proof.
Given any distinct elements \(\boldv_1, \boldv_2, \dots, \boldv_r\in S\text{,}\) we have
\begin{align*} c_1\boldv_1 +c_2\boldv_2+\cdots +c_r\boldv_r=\boldzero\amp \Rightarrow\amp \langle c_1\boldv_1 +c_2\boldv_2 +\cdots +c_r\boldv_r,\boldv_i\rangle=\langle\boldzero,\boldv_i\rangle\\ \amp \Rightarrow\amp c_1\langle\boldv_1,\boldv_i\rangle +c_2\langle \boldv_2,\boldv_i\rangle +\cdots +c_r\langle\boldv_r,\boldv_i\rangle=0\\ \amp \Rightarrow\amp c_i\langle \boldv_i,\boldv_i\rangle=0 \ \text{ (since \(\langle\boldv_j,\boldv_i\rangle= 0\) for \(j\ne i\)) }\\ \amp \Rightarrow\amp c_i=0 \text{ (since \(\langle\boldv_i,\boldv_i\rangle\ne 0\)) }\text{.} \end{align*}
This proves that \(S\) is linearly independent.

Example 4.2.3.

Show that the set \(S=\{\boldx_1=(1,1,1),\boldx_2=(1,2,-3), \boldx_3=(-5,4,1)\}\) is orthogonal with respect to the dot product. Explain why it follows that \(S\) is a basis of \(\R^3\text{.}\)
Solution.
A direct computation shows \(\boldx_i\cdot \boldx_j=0\) for all \(1\leq i\ne j\leq 3\text{,}\) so \(S\) is orthogonal. Theorem 4.2.2 then implies \(S\) is linearly independent. Since \(|S|=\dim \R^3=3\text{,}\) it follows from Corollary 3.7.9 that \(S\) is a basis.
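The pairwise dot products here are quick to confirm by hand, and just as quick to confirm numerically. A minimal Python sketch (the helper `dot` is our own, not from the text):

```python
# Check that x1, x2, x3 from Example 4.2.3 are pairwise orthogonal
# with respect to the standard dot product on R^3.
x1 = (1, 1, 1)
x2 = (1, 2, -3)
x3 = (-5, 4, 1)

def dot(u, v):
    """Standard dot product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

# All three pairwise products vanish, so the set is orthogonal.
products = [dot(x1, x2), dot(x1, x3), dot(x2, x3)]
```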

Example 4.2.4.

Let \(V=C([0,2\pi])\) with integral inner product \(\langle f, g\rangle=\int_0^{2\pi} f(x)g(x) \, dx\text{,}\) and let
\begin{equation*} S=\{1, \cos(x),\sin(x),\cos(2x),\sin(2x), \dots\}=\{\cos(nx)\colon n\in\Z_{\geq 0}\}\cup\{\sin(mx)\colon m\in\Z_{>0}\}\text{,} \end{equation*}
where the element \(1\in S\) is understood as the constant function \(f(x)=1\) for all \(x\in [0,2\pi]\text{.}\) Show that \(S\) is orthogonal and hence linearly independent.
Solution.
First observe that
\begin{align*} \angvec{1,1} \amp = \int_0^{2\pi} 1\, dx=2\pi \\ \angvec{1, \cos n x} \amp= \int_0^{2\pi}\cos n x\, dx=0 \\ \angvec{1, \sin n x} \amp= \int_0^{2\pi}\sin n x\, dx=0 \end{align*}
for all \(n\text{.}\) (Note: since \(\angvec{1,1}=2\pi\ne 1\text{,}\) the set \(S\) is not orthonormal.) Next, using the trigonometric identities
\begin{align*} \cos\alpha\cos\beta \amp =\frac{1}{2}(\cos(\alpha-\beta)+\cos(\alpha+\beta))\\ \sin\alpha\sin\beta \amp=\frac{1}{2}(\cos(\alpha-\beta)-\cos(\alpha+\beta)) \\ \sin\alpha\cos\beta \amp =\frac{1}{2}(\sin(\alpha-\beta)+\sin(\alpha+\beta)) \end{align*}
it follows that
\begin{align*} \langle \cos(nx),\cos(mx)\rangle=\int_0^{2\pi}\cos(nx)\cos(mx)\, dx\amp =\begin{cases} 0\amp \text{ if \(n\ne m\) }\\ \pi\amp \text{ if \(n=m\) } \end{cases}\\ \langle \sin(nx),\sin(mx)\rangle=\int_0^{2\pi}\sin(nx)\sin(mx)\, dx\amp =\begin{cases} 0\amp \text{ if \(n\ne m\) }\\ \pi\amp \text{ if \(n=m\) } \end{cases}\\ \langle \cos(nx),\sin(mx)\rangle=\int_0^{2\pi}\cos(nx)\sin(mx)\, dx\amp =0 \text{ for any \(n,m\) }\text{.} \end{align*}
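These orthogonality relations can be sanity-checked numerically. A Python sketch using a midpoint Riemann sum; the helper `integral` and the particular frequency choices are illustrative assumptions, not from the text:

```python
import math

def integral(f, a=0.0, b=2 * math.pi, n=20000):
    """Approximate the integral of f on [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

# <cos(2x), cos(3x)>: distinct frequencies, so the integral should be 0
cc = integral(lambda x: math.cos(2 * x) * math.cos(3 * x))
# <sin(2x), sin(2x)>: equal frequencies, so the integral should be pi
ss = integral(lambda x: math.sin(2 * x) ** 2)
# <cos(4x), sin(x)>: a cosine against a sine, so the integral should be 0
cs = integral(lambda x: math.cos(4 * x) * math.sin(x))
```

For smooth periodic integrands over a full period the midpoint rule is extremely accurate, so these values match the table above to high precision.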

Subsection 4.2.2 Orthogonal bases

Given an inner product space \(V\text{,}\) we will see that working with orthogonal sets of vectors is extremely convenient computationally. In particular, when picking a basis of \(V\text{,}\) we will look for one consisting of orthogonal vectors. Not surprisingly, such a basis is called an orthogonal basis.

Definition 4.2.5. Orthogonal and orthonormal bases.

Let \((V,\langle \ , \rangle)\) be an inner product space. An orthogonal basis (resp., orthonormal basis) of \(V\) is a basis \(B\) that is orthogonal (resp., orthonormal) as a set.
Theorem 4.2.7 serves as a first example illustrating the virtue of orthogonal and orthonormal bases: relative to an orthogonal basis \(B=\{\boldv_1,\dots,\boldv_n\}\text{,}\) every \(\boldv\in V\) satisfies \(\boldv=\sum_{i=1}^n \frac{\langle \boldv,\boldv_i\rangle}{\langle \boldv_i,\boldv_i\rangle}\boldv_i\text{,}\) and when \(B\) is orthonormal the coefficients reduce to \(c_i=\langle \boldv,\boldv_i\rangle\text{.}\) The general principle it conveys: relative to an orthogonal basis, coordinates are computed with inner products, not by solving linear systems.

Example 4.2.8.

Consider the inner product space \(V=\R^2\) with the dot product.
  1. Verify that \(B'=\{\boldv_1=(\sqrt{3}/2,1/2), \boldv_2=(-1/2,\sqrt{3}/2)\}\) is an orthonormal basis of \(\R^2\text{.}\)
  2. Let \(\boldv=(4,2)\text{.}\) Find the scalars \(c_1, c_2\in \R\) such that \(\boldv=c_1\boldv_1+c_2\boldv_2\text{.}\)
  3. Verify that \(\norm{\boldv}=\sqrt{c_1^2+c_2^2}\text{.}\)
Solution.
  1. Both vectors are unit vectors, since \(\norm{\boldv_1}^2=3/4+1/4=1\) and \(\norm{\boldv_2}^2=1/4+3/4=1\text{,}\) and \(\boldv_1\cdot\boldv_2=-\sqrt{3}/4+\sqrt{3}/4=0\text{.}\)
  2. Using Theorem 4.2.7 we compute
    \begin{align*} c_1 \amp =\boldv\cdot \boldv_1=2\sqrt{3}+1 \\ c_2\amp= \boldv\cdot \boldv_2=\sqrt{3}-2 \text{.} \end{align*}
  3. Computing directly yields \(\norm{\boldv}=\sqrt{20}=2\sqrt{5}\text{.}\) Using the generalized Pythagorean theorem we have
    \begin{align*} \norm{\boldv} \amp= \sqrt{(2\sqrt{3}+1)^2+(\sqrt{3}-2)^2} \\ \amp=\sqrt{(12+4\sqrt{3}+1)+(3-4\sqrt{3}+4)} \\ \amp = \sqrt{20}=2\sqrt{5}\text{,} \end{align*}
    as desired.
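The computations in this example are easy to replicate numerically. A Python sketch (the variable names are ours, not the text's):

```python
import math

# The orthonormal basis and vector from Example 4.2.8.
v1 = (math.sqrt(3) / 2, 1 / 2)
v2 = (-1 / 2, math.sqrt(3) / 2)
v = (4, 2)

def dot(u, w):
    """Standard dot product on R^2."""
    return sum(a * b for a, b in zip(u, w))

# For an orthonormal basis, each coordinate is a single dot product.
c1 = dot(v, v1)  # equals 2*sqrt(3) + 1
c2 = dot(v, v2)  # equals sqrt(3) - 2

norm_direct = math.hypot(*v)      # sqrt(4^2 + 2^2) = sqrt(20)
norm_coords = math.hypot(c1, c2)  # sqrt(c1^2 + c2^2), the same value
```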
Theorem 4.2.7 gives a first glimpse of how working with orthogonal and orthonormal bases can make life easier. The question remains, however: can we always find an orthonormal basis? The Gram-Schmidt procedure gives an affirmative answer, at least in the finite-dimensional case, as it provides a method for converting an arbitrary basis into an orthogonal one.
  1. See Procedure 4.2.9 and its proof.
  2. The orthogonal set \(S\) is linearly independent by Theorem 4.2.2. Let \(S=\{\boldv_1,\boldv_2,\dots, \boldv_r\}\) be the distinct elements of \(S\text{.}\) (We must have \(r\leq n\) since \(S\) is linearly independent.) By Theorem 3.7.8 we can extend \(S\) to a basis \(B=\{\boldv_1,\dots, \boldv_r, \boldv_{r+1}, \dots, \boldv_n\}\text{.}\) It is easy to see that when we apply the Gram-Schmidt procedure to \(B\text{,}\) the first \(r\) vectors are left unchanged, as they are already pairwise orthogonal. Thus Gram-Schmidt returns an orthogonal basis of the form \(B'=\{\boldv_1,\dots, \boldv_r, \boldw_{r+1}, \dots, \boldw_n\}\text{,}\) as desired.
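The Gram-Schmidt procedure can be sketched in Python for \(\R^n\) with the dot product. This is our own illustration, not the text's Procedure 4.2.9; the sample basis is an arbitrary choice:

```python
def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Convert a basis of R^n (dot product) into an orthogonal basis:
    subtract from each vector its projections onto the vectors
    produced so far."""
    ortho = []
    for v in basis:
        w = list(v)
        for u in ortho:
            coeff = dot(v, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

B = [(1, 1, 1), (1, 2, -3), (1, 0, 0)]  # an arbitrary basis of R^3
Bp = gram_schmidt(B)
```

The example also illustrates the remark in part 2 above: since \((1,1,1)\) and \((1,2,-3)\) are already orthogonal, Gram-Schmidt leaves them unchanged and only adjusts the third vector.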

Exercises 4.2.3 Exercises

Webwork Exercises

1.
Let \(\{ \mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3 \}\) be an orthonormal basis for an inner product space \(V\text{.}\) If
\begin{equation*} \mathbf{v} = a \mathbf{u}_1 + b \mathbf{u}_2 + c \mathbf{u}_3 \end{equation*}
is such that \(\|\mathbf{v}\| = 78\text{,}\) \(\mathbf{v}\) is orthogonal to \(\mathbf{u}_3\text{,}\) and \(\langle \mathbf{v},\mathbf{u}_2 \rangle = -78\text{,}\) find the possible values for \(a\text{,}\) \(b\text{,}\) and \(c\text{.}\)
\(a =\) , \(b =\) , \(c =\)
Answer 1.
\(0\)
Answer 2.
\(-78\)
Answer 3.
\(0\)
Solution.
Since the basis is orthonormal, \(c=\langle \mathbf{v},\mathbf{u}_3\rangle=0\) and \(b=\langle \mathbf{v},\mathbf{u}_2\rangle=-78\text{.}\) Then \(78^2=\|\mathbf{v}\|^2=a^2+b^2+c^2=a^2+78^2\) forces \(a=0\text{.}\) Hence
\(a = 0\text{,}\) \(b = -78\text{,}\) \(c = 0\text{.}\)
2.
Let
\begin{equation*} \vec{x} = \left[\begin{array}{c} 2\cr -2\cr 4 \end{array}\right] \ \mbox{ and } \ \vec{y} = \left[\begin{array}{c} 1\cr -5\cr 6 \end{array}\right]. \end{equation*}
Use the Gram-Schmidt process to determine an orthonormal basis for the subspace of \({\mathbb R}^3\) spanned by \(\vec{x}\) and \(\vec{y}\text{.}\)
\(\Bigg\lbrace\) (3 × 1 array), (3 × 1 array) \(\Bigg\rbrace.\)
3.
Let
\begin{equation*} \vec{x} = \left[\begin{array}{c} 3\cr -1\cr 4\cr 0 \end{array}\right], \ \vec{y} = \left[\begin{array}{c} 5.5\cr -0.5\cr -1\cr 1 \end{array}\right], \ \vec{z} = \left[\begin{array}{c} 5\cr -5\cr 1.5\cr -28.5 \end{array}\right]. \end{equation*}
Use the Gram-Schmidt process to determine an orthonormal basis for the subspace of \({\mathbb R}^4\) spanned by \(\vec{x}\text{,}\) \(\vec{y}\text{,}\) and \(\vec{z}\text{.}\)
\(\Bigg\lbrace\) (4 × 1 array), (4 × 1 array), (4 × 1 array) \(\Bigg\rbrace.\)
4.
Let
\begin{equation*} A = \left[\begin{array}{cc} 3 \amp -6\cr 2 \amp -5\cr 1 \amp 0 \end{array}\right]. \end{equation*}
Find an orthonormal basis of the image of \(A\text{.}\)
\(\Bigg\lbrace\) (3 × 1 array), (3 × 1 array) \(\Bigg\rbrace.\)
5.
Let
\begin{equation*} A = \left[\begin{array}{cccc} -12 \amp -6 \amp 8 \amp -9\cr 9 \amp 6 \amp -6 \amp 9 \end{array}\right]. \end{equation*}
Find an orthonormal basis of the kernel of \(A\text{.}\)
\(\Bigg\lbrace\) (4 × 1 array), (4 × 1 array) \(\Bigg\rbrace.\)

6.

The vectors
\begin{equation*} \boldv_1=(1,1,1,1), \boldv_2=(1,-1,1,-1), \boldv_3=(1,1,-1,-1), \boldv_4=(1,-1,-1,1) \end{equation*}
are pairwise orthogonal with respect to the dot product, as is easily verified. For each \(\boldv\) below, find the scalars \(c_i\) such that
\begin{equation*} \boldv=c_1\boldv_1+c_2\boldv_2+c_3\boldv_3+c_4\boldv_4\text{.} \end{equation*}
  1. \(\displaystyle \boldv=(3,0,-1,0)\)
  2. \(\displaystyle \boldv=(1,2,0,1)\)
  3. \(\boldv=(a,b,c,d)\) (Your answer will be expressed in terms of \(a,b,c\text{,}\) and \(d\text{.}\) )
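Answers to this exercise can be checked numerically. For an orthogonal (not necessarily orthonormal) basis, each coefficient is \(c_i = \boldv\cdot\boldv_i/(\boldv_i\cdot\boldv_i)\text{;}\) a Python sketch of this self-check (the helper names are ours):

```python
def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

# The orthogonal basis of R^4 from this exercise.
basis = [(1, 1, 1, 1), (1, -1, 1, -1), (1, 1, -1, -1), (1, -1, -1, 1)]

def coords(v):
    """Coefficients of v relative to an orthogonal basis:
    c_i = (v . v_i) / (v_i . v_i)."""
    return [dot(v, b) / dot(b, b) for b in basis]

def recombine(c):
    """Rebuild sum of c_i * v_i, to confirm a list of coefficients."""
    return [sum(ci * bi[j] for ci, bi in zip(c, basis)) for j in range(4)]
```

Running `recombine(coords(v))` should reproduce `v`, which makes this a convenient way to verify your hand computations.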

7.

Consider the inner product space given by \(\R^4\) together with the dot product. Construct an orthogonal basis of \(\R^4\) containing \(\boldv_1=(1,1,1,1)\) following the steps below.
  1. Produce a vector \(\boldv_2\) orthogonal to \(\boldv_1\) by inspection.
  2. Produce a vector \(\boldv_3\) orthogonal to \(\boldv_1\) and \(\boldv_2\) by setting up an appropriate matrix equation of the form \(A\boldx=\boldzero\) and finding a nontrivial solution. (Use Theorem 4.1.9.)
  3. Produce a vector \(\boldv_4\) orthogonal to \(\boldv_1, \boldv_2, \boldv_3\) by setting up an appropriate matrix equation of the form \(B\boldx=\boldzero\) and finding a nontrivial solution. (Use Theorem 4.1.9.)

8.

Consider the inner product space given by \(V=\R^3\) together with the dot product. Let \(W\) be the plane with defining equation \(x+2y-z=0\text{.}\) Compute an orthogonal basis of \(W\text{,}\) and then extend this to an orthogonal basis of \(\R^3\text{.}\)
Hint.
You do not have to use Gram-Schmidt here; you can proceed using a combination of inspection, your geometric understanding of \(W\text{,}\) and/or reasoning similar to that of Exercise 4.2.3.7.

9.

Consider the inner product space given by \(\R^3\) together with the weighted dot product
\begin{equation*} \langle (x_1,x_2,x_3),(y_1,y_2,y_3)\rangle=x_1y_1+2x_2y_2+3x_3y_3\text{.} \end{equation*}
Use the Gram-Schmidt procedure to convert the basis \(B=\{(1,1,0),(0,1,1), (1,0,1)\}\) into a basis that is orthogonal with respect to this weighted dot product.

10.

Consider the vector space \(V=C([0,1])\) with the integral inner product. Let \(B=\{x^2,x^4,x^6\}\text{,}\) and define \(W=\Span B\text{.}\) Apply Gram-Schmidt to \(B\) to obtain an orthogonal basis of \(W\text{.}\)

11.

Consider the vector space \(V=P_2\) with the evaluation at \(-1, 0, 1\) inner product:
\begin{equation*} \angvec{p(x),q(x)}=p(-1)q(-1)+p(0)q(0)+p(1)q(1)\text{.} \end{equation*}
Apply Gram-Schmidt to the standard basis of \(P_2\) to obtain an orthogonal basis of \(P_2\text{.}\)

12.

Consider the inner product space given by \(\R^4\) together with the dot product, and define
\begin{equation*} W=\{\boldx\in \R^4\colon \boldx\cdot (1,2,-1,-1)=0\}\text{.} \end{equation*}
  1. Show that \(W\) is a subspace of \(\R^4\) by finding a matrix \(A\) for which \(W=\NS A\text{.}\)
  2. Use (a) and an appropriate fundamental space algorithm to find a basis for \(W\text{.}\)
  3. Use Gram-Schmidt to convert your basis in (b) to an orthogonal basis of \(W\text{.}\)

13.

Let \((V,\langle , \rangle )\) be an inner product space. Prove: if \(\angvec{\boldv,\boldw}=0\text{,}\) then
\begin{equation*} \norm{\boldv+\boldw}^2=\norm{\boldv}^2+\norm{\boldw}^2\text{.} \end{equation*}
This result can be thought of as the Pythagorean theorem for general inner product spaces.

14.

Let \((V, \langle , \rangle )\) be an inner product space, and suppose \(B=\{\boldv_1, \boldv_2, \dots, \boldv_n\}\) is an orthonormal basis of \(V\text{.}\) Suppose \(\boldv, \boldw\in V\) satisfy
\begin{equation*} \boldv=\sum_{i=1}^nc_i\boldv_i, \qquad \boldw=\sum_{i=1}^nd_i\boldv_i\text{.} \end{equation*}
  1. Prove:
    \begin{equation*} \langle \boldv, \boldw\rangle =\sum_{i=1}^nc_id_i\text{.} \end{equation*}
  2. Prove:
    \begin{equation*} \norm{\boldv}=\sqrt{\sum_{i=1}^nc_i^2}\text{.} \end{equation*}
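Both identities in this exercise can be sanity-checked numerically (a check, not a proof). A Python sketch using a rotated orthonormal basis of \(\R^2\) with the dot product; the angle and coefficients are arbitrary choices of ours:

```python
import math

t = 0.7  # any angle yields an orthonormal basis of R^2
u1 = (math.cos(t), math.sin(t))
u2 = (-math.sin(t), math.cos(t))

def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

c = (2.0, -1.0)  # coordinates of v relative to {u1, u2}
d = (0.5, 3.0)   # coordinates of w
v = tuple(c[0] * a + c[1] * b for a, b in zip(u1, u2))
w = tuple(d[0] * a + d[1] * b for a, b in zip(u1, u2))

inner_vw = dot(v, w)                   # <v, w>
coord_sum = c[0] * d[0] + c[1] * d[1]  # sum of c_i d_i, as in part (a)
norm_v = math.sqrt(dot(v, v))          # equals sqrt(c1^2 + c2^2), part (b)
```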