Suppose we wish to find a line \(\ell\colon y=mx+b\) that best fits (in the least-squares sense) the following data points: \(P_1=(-3,1), P_2=(1,2), P_3=(2,3)\text{.}\) Following the discussion above, we seek a solution \(\boldx=(m,b)\) to the matrix equation \(A\boldx=\boldy\text{,}\) where
\begin{equation*}
\boldx=\begin{bmatrix}m \\ b \end{bmatrix}, A=\begin{amatrix}[rr]-3\amp 1\\ 1\amp 1\\ 2\amp 1 \end{amatrix} , \boldy=\begin{bmatrix}1\\ 2\\ 3 \end{bmatrix}\text{.}
\end{equation*}
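For concreteness, demanding that each data point lie on \(\ell\) amounts to the system
\begin{equation*}
-3m+b=1, \quad m+b=2, \quad 2m+b=3\text{,}
\end{equation*}
whose coefficient matrix and right-hand side are precisely the \(A\) and \(\boldy\) above.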
Using Gaussian elimination, we easily see that this equation has no solution: equivalently,
\(\boldy\notin W=\CS A\text{.}\) Accordingly, we compute
\(\hat{\boldy}=\proj{\boldy}{W}\) and find a solution to
\(A\hat{\boldx}=\hat{\boldy}\text{.}\) Conveniently, the set
\(B=\{(-3,1,2), (1,1,1)\}\) is already an
orthogonal basis of
\(W=\CS A\text{,}\) allowing us to use
(4.3.2):
\begin{equation*}
\hat{\boldy}=\frac{\boldy\cdot (-3,1,2)}{(-3,1,2)\cdot (-3,1,2)}(-3,1,2)+\frac{\boldy\cdot(1,1,1)}{(1,1,1)\cdot (1,1,1)}(1,1,1)=\frac{1}{14}(13, 33, 38)\text{.}
\end{equation*}
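In detail, the dot products appearing above are
\begin{equation*}
\boldy\cdot(-3,1,2)=5, \quad (-3,1,2)\cdot(-3,1,2)=14, \quad \boldy\cdot(1,1,1)=6, \quad (1,1,1)\cdot(1,1,1)=3\text{,}
\end{equation*}
so the two projection coefficients are \(5/14\) and \(2\text{.}\)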
Lastly, solving \(A\hat{\boldx}=\hat{\boldy}\) yields \((m,b)=\hat{\boldx}=(5/14, 2)\text{,}\) and we conclude the line \(\ell\colon y=(5/14)x+2\) is the one that best fits the data in the least-squares sense.
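As a quick check, the residual \(\boldy-\hat{\boldy}=\frac{1}{14}(1,-5,4)\) is orthogonal to both basis vectors of \(W\text{:}\)
\begin{equation*}
(1,-5,4)\cdot(-3,1,2)=-3-5+8=0, \quad (1,-5,4)\cdot(1,1,1)=1-5+4=0\text{,}
\end{equation*}
confirming that \(\hat{\boldy}\) is indeed the orthogonal projection of \(\boldy\) onto \(W\text{.}\)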